Amazing Science
Scooped by Dr. Stefan Gruenwald!

Launching in 2015: A Certificate Authority to Encrypt the Entire Web


Today EFF is pleased to announce Let’s Encrypt, a new certificate authority (CA) initiative that we have put together with Mozilla, Cisco, Akamai, IdenTrust, and researchers at the University of Michigan that aims to clear the remaining roadblocks to transition the Web from HTTP to HTTPS.

Although the HTTP protocol has been hugely successful, it is inherently insecure. Whenever you use an HTTP website, you are always vulnerable to problems, including account hijacking and identity theft; surveillance and tracking by governments, companies, and both in concert; injection of malicious scripts into pages; and censorship that targets specific keywords or specific pages on sites. The HTTPS protocol, though it is not yet flawless, is a vast improvement on all of these fronts, and we need to move to a future where every website is HTTPS by default.

With a launch scheduled for summer 2015, the Let’s Encrypt CA will automatically issue and manage free certificates for any website that needs them. Switching a webserver from HTTP to HTTPS with this CA will be as easy as issuing one command, or clicking one button.

The biggest obstacle to HTTPS deployment has been the complexity, bureaucracy, and cost of the certificates that HTTPS requires. We’re all familiar with the warnings and error messages produced by misconfigured certificates. These warnings are a hint that HTTPS (and other uses of TLS/SSL) is dependent on a horrifyingly complex and often structurally dysfunctional bureaucracy for authentication.

The need to obtain, install, and manage certificates from that bureaucracy is the largest reason that sites keep using HTTP instead of HTTPS. In our tests, it typically takes a web developer 1-3 hours to enable encryption for the first time. The Let’s Encrypt project is aiming to fix that by reducing setup time to 20-30 seconds. You can help test and hack on the developer preview of our Let's Encrypt agent software.
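None of this changes what browsers already verify; it only makes the server side cheap and automatic. As a rough illustration (mine, not part of the EFF announcement), Python's standard library shows the client-side trust check that any CA-signed certificate, including a free Let's Encrypt one, has to pass:

```python
import ssl

# A default client context refuses connections whose certificate chain
# does not lead back to a trusted certificate authority -- this is the
# validation a CA-issued certificate exists to satisfy.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: a valid chain is mandatory
print(ctx.check_hostname)                    # True: the cert must match the hostname
```

Both checks are on by default for server authentication, which is why a site without a valid certificate cannot simply be served over HTTPS — and why free, automated issuance removes the last practical excuse for plain HTTP.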

Let’s Encrypt will employ a number of new technologies to manage secure automated verification of domains and issuance of certificates. We will use a protocol we’re developing called ACME between web servers and the CA, which includes support for new and stronger forms of domain validation. We will also employ Internet-wide datasets of certificates, such as EFF’s own Decentralized SSL Observatory, the University of Michigan’s, and Google's Certificate Transparency logs, to make higher-security decisions about when a certificate is safe to issue.

The Let’s Encrypt CA will be operated by a new non-profit organization called the Internet Security Research Group (ISRG). EFF helped to put together this initiative with Mozilla and the University of Michigan, and it has been joined for launch by partners including Cisco, Akamai, and IdenTrust.


Genomically encoded analog memory with precise in vivo DNA writing using living bacteria populations


MIT engineers have transformed the genome of the bacterium E. coli into a long-term storage device for memory. They envision that this stable, erasable, and easy-to-retrieve memory will be well suited for applications such as sensors for environmental and medical monitoring. “You can store very long-term information,” says Timothy Lu, an associate professor of electrical engineering and computer science and biological engineering. “You could imagine having this system in a bacterium that lives in your gut, or environmental bacteria. You could put this out for days or months, and then come back later and see what happened at a quantitative level.”

The new strategy, described in the Nov. 13, 2014 issue of the journal Science ("Genomically encoded analog memory with precise in vivo DNA writing in living cell populations"), overcomes several limitations of existing methods for storing memory in bacterial genomes, says Lu, the paper’s senior author. Those methods require a large number of genetic regulatory elements, limiting the amount of information that can be stored. The earlier efforts are also limited to digital memory, meaning that they can record only all-or-nothing memories, such as whether a particular event occurred. Lu and graduate student Fahim Farzadfard, the paper’s lead author, set out to create a system for storing analog memory, which can reveal how much exposure there was, or how long it lasted. To achieve that, they designed a “genomic tape recorder” that lets researchers write new information into any bacterial DNA sequence.

The researchers showed that SCRIBE enables the recording of arbitrary transcriptional inputs into DNA storage registers in living cells by translating regulatory signals into ssDNAs. In E. coli, they expressed ssDNAs from engineered retrons that use a reverse transcriptase protein to produce hybrid RNA-ssDNA molecules. These intracellularly expressed ssDNAs are targeted into specific genomic loci where they are recombined and converted into permanent memory. The team could show that genomically stored information can be readily reprogrammed by changing the ssDNA template and controlled via both chemical and light inputs. This demonstrates that genomically encoded memory can be read with a variety of techniques, including reporter genes, functional assays, and high-throughput DNA sequencing.

SCRIBE enables the recording of analog information such as the magnitude and time span of exposure to an input. This convenient feature is facilitated by the intermediate recombination rate of the current system (~10⁻⁴ recombination events per generation), which the authors validated via a mathematical model and computer simulations. For example, the scientists stored the overall exposure time to chemical inducers in the DNA memory of bacterial populations for 12 days (~120 generations), independently of the induction pattern. The frequency of mutants in these populations was linearly related to the total exposure time. Furthermore, they were able to demonstrate that SCRIBE-induced mutations can be written and erased and can be used to record multiple inputs across the distributed genomic DNA of bacterial populations.
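That linear relationship follows directly from a small, constant per-generation recombination rate. A minimal sketch of the population-level arithmetic (my own toy model, not the authors' published one; the per-generation rate is the figure quoted above):

```python
RATE = 1e-4  # recombination events per cell per generation, as reported

def mutant_fraction(generations_induced: int, rate: float = RATE) -> float:
    """Expected fraction of the population carrying the recorded mutation
    after the input has been present for the given number of generations."""
    # Each generation, a fraction `rate` of still-unmodified cells recombines.
    return 1.0 - (1.0 - rate) ** generations_induced

# Over the 12-day experiment (~120 generations) the response is nearly linear,
# because rate * generations stays small:
for g in (30, 60, 90, 120):
    print(g, mutant_fraction(g))

# And the readout depends only on total exposure, not on its pattern:
two_pulses = 1.0 - (1.0 - RATE) ** 60 * (1.0 - RATE) ** 60
assert abs(mutant_fraction(120) - two_pulses) < 1e-12
```

After 120 generations only about 1.2% of cells carry the mutation, which is why the paper reads the registers at the population level (e.g. by deep sequencing) rather than in single cells.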

Finally, they could show that SCRIBE memory can be decomposed into independent “input,” “write,” and “read” operations and used to create genetic “logic-and-memory” circuits, as well as “sample-and-hold” circuits.

Conclusion: SCRIBE is a scalable platform that uses genomic DNA for analog, rewritable, and flexible memory distributed across living cell populations. The scientists anticipate that SCRIBE will enable long-term cellular recorders for environmental and biomedical applications. Future optimization of recombination efficiencies achievable by SCRIBE could lead to more efficient single-cell digital memories and enhanced genome engineering technologies. Furthermore, the ability to regulate the generation of arbitrary targeted mutations with other gene-editing technologies should enable genomically encoded memory in additional organisms.

Billions of nanoreactors inform about design structure of materials


Imagine building a chemical reactor small enough to study nanoparticles a billionth of a meter across. The volume of an E. coli cell is a billion times smaller than a raindrop; a reactor small enough to study isolated nanoparticles would be another million times smaller still. Add to that the challenge of making not just one of these tiny reactors, but billions of them, all identical in size and shape. Researchers at Cornell have done just that. A team led by Tobias Hanrath, associate professor of chemical and biomolecular engineering, has demonstrated controlled fusion of semiconductor quantum dots within a nanoreactor cage of rusty particles.

The team arranged six lead selenide crystals within a framework of iron oxide (rust) spheres. They studied how the quantum dots within the nanoscale “rusty cage” interact, using X-rays at the Cornell High Energy Synchrotron Source (CHESS). These experiments allowed them to pinpoint specific interactions between particles in the box and thus pave the way for making novel materials with properties by design. The results, which could be applied to other materials, were published in Scientific Reports ("Connecting the Particles in the Box – Controlled Fusion of Hexamer Nanocrystal Clusters within an AB6 Binary Nanocrystal Superlattice").

They used CHESS to perform X-ray scattering on repeating units of these rusty boxes as they heated them up, watching what happens to the lead selenide in the center. With the scattering data acting like a high-definition movie, they could identify different stages of fusion of the lead selenide hexamers. This could lead to insight into getting specific functionalities out of these little-understood nanomaterials. Too much heat made the lead crystals sinter and fuse; too little heat failed to pull them close enough together to interact.

Graduate student Ben Treml led the experiments; he synthesized the particles and assembled them into superlattices (lattices of nanocrystals, rather than atoms). The samples were studied at the D1 beam line of CHESS with co-author Detlef Smilgies, staff scientist, who helped Treml refine the experiments. The results were verified with theoretical modeling by co-authors Paulette Clancy, professor of chemical and biomolecular engineering, and postdoctoral associate Binit Lukose.


Philae lander detects organic molecules on surface of comet


Philae spacecraft beams back evidence of carbon and hydrogen that could provide clues about the origins of life on Earth. Although scientists have yet to reveal what kind of molecules have been found on comet 67P/Churyumov-Gerasimenko, the discovery could provide new clues about how the early chemical ingredients that led to life on Earth arrived on the planet. “We currently have no information on the quantity and weight of the soil sample,” said Fred Goesmann, principal investigator on the Cosac instrument at the Max Planck Institute for Solar System Research.

Many scientists believe these ingredients may have been carried here on an asteroid or comet that collided with the Earth during its early history.

The DLR German Aerospace Centre, which built the Cosac instrument, confirmed it had found organic molecules.

It said in a statement: “Cosac was able to ‘sniff’ the atmosphere and detect the first organic molecules after landing. Analysis of the spectra and the identification of the molecules are continuing.”

The compounds were picked up by the instrument, which is designed to “sniff” the comet’s thin atmosphere, shortly before the lander was powered down. It is believed that attempts to analyse soil drilled from the comet’s surface with Cosac were not successful.

Philae was able to work for more than 60 hours on the comet, which is more than 500 million kilometres from Earth, before entering hibernation.


E-cigarettes significantly reduce tobacco cravings with only minimal side effects, study finds


Electronic cigarettes offer smokers a realistic way to kick their tobacco smoking addiction. In a new study published in the International Journal of Environmental Research and Public Health, scientists at KU Leuven report that e-cigarettes successfully reduced cravings for tobacco cigarettes, with only minimal side effects.

Electronic cigarettes (e-cigs) were developed as a less harmful alternative to tobacco cigarettes. They contain 100 to 1,000 times fewer toxic substances and emulate the experience of smoking a tobacco cigarette.

In an 8-month study, the KU Leuven researchers examined the effect of using e-cigs (“vaping”) in 48 participants, all of whom were smokers with no intention to quit. The researchers’ goal was to evaluate whether e-cigs decreased the urge to smoke tobacco cigarettes in the short term, and whether e-cigs helped people stop smoking altogether in the long-term.

The participants were divided into three groups: two e-cig groups, which were allowed to vape and smoke tobacco cigarettes for the first two months of the study, and a control group that only had access to tobacco. In a second phase of the study, the control group was given e-cigs and all participants were monitored for a period of six months via a web tool, where they regularly logged their vaping and smoking habits. 

In the lab, the e-cigs proved to be just as effective in suppressing the craving for a smoke as tobacco cigarettes were, while the amount of exhaled carbon monoxide remained at baseline levels. In the long-term analysis, results showed that the smokers were more likely to trade in their tobacco cigarettes for e-cigs and taper off their tobacco use.


Alien Life Could Thrive on 'Supercritical' CO2 Instead of Water


Alien life might flourish on an exotic kind of carbon dioxide, researchers say. This "supercritical" carbon dioxide, which has features of both liquids and gases, could be key to extraterrestrial organisms much as water is to biology on Earth.

Most familiar as a greenhouse gas that traps heat, helping warm the planet, carbon dioxide is exhaled by animals and used by plants in photosynthesis. While it can exist as a solid, liquid and gas, past a critical point of combined temperature and pressure, carbon dioxide can enter a "supercritical" state. Such a supercritical fluid has properties of both liquids and gases. For example, it can dissolve materials like a liquid, but flow like a gas.

The critical point for carbon dioxide is about 88 degrees Fahrenheit (31 degrees Celsius) and about 73 times Earth's atmospheric pressure at sea level. This is about equal in pressure to that found nearly a half-mile (0.8 kilometers) under the ocean's surface. Supercritical carbon dioxide is increasingly used in a variety of applications, such as decaffeinating coffee beans and dry cleaning.
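Those two figures are easy to sanity-check against each other with the hydrostatic pressure formula P = ρgh. A back-of-envelope sketch (my own arithmetic, using standard seawater values; only the critical-point numbers come from the article):

```python
ATM = 101_325.0        # Pa per standard atmosphere
RHO_SEAWATER = 1025.0  # kg/m^3, typical surface seawater density
G = 9.81               # m/s^2

CO2_CRITICAL_T_C = 31.0       # deg C, figure quoted in the article
CO2_CRITICAL_P = 73.0 * ATM   # ~73 atm, figure quoted in the article

def is_supercritical_co2(temp_c: float, pressure_pa: float) -> bool:
    # Beyond both critical values, distinct liquid and gas phases vanish.
    return temp_c > CO2_CRITICAL_T_C and pressure_pa > CO2_CRITICAL_P

# Ocean depth whose water column (plus the atmosphere above it)
# adds up to the critical pressure:
depth_m = (CO2_CRITICAL_P - ATM) / (RHO_SEAWATER * G)
print(round(depth_m))  # ~726 m, consistent with the article's "nearly a half-mile"
```

The article's 0.8 km is a round-number statement of the same calculation; the small difference comes only from the seawater density and rounding assumed here.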

Ordinarily, carbon dioxide is not considered a viable solvent to host the chemical reactions for life, but the properties of supercritical fluids can differ quite significantly from those of the regular versions of those fluids — for instance, while regular water is not acidic, supercritical water is acidic. Given how substantially different supercritical carbon dioxide is from regular carbon dioxide in terms of physical and chemical properties, scientists explored whether it could be suitable for life.

"I always have been interested in possibly exotic life and creative adaptations of organisms to extreme environments," said study co-author Dirk Schulze-Makuch, an astrobiologist at Washington State University in Pullman. "Supercritical CO2 is often overlooked, so I felt that someone had to put together something on its biological potential."

The researchers noted that enzymes can be more stable in supercritical carbon dioxide than in water. In addition, supercritical carbon dioxide makes enzymes more specific about the molecules they bind to, leading to fewer unnecessary side reactions.

Surprisingly, a number of species of bacteria are tolerant of supercritical carbon dioxide. Prior research found that several different microbial species and their enzymes are active in the fluid.

In addition, exotic locales on Earth support the idea that life can survive in environments rich in carbon dioxide. Previous studies showed that microbes can live near pockets of liquid carbon dioxide trapped under Earth's oceans.


AI breaking ground: building a natural description of images


People can summarize a complex scene in a few words without thinking twice. It’s much more difficult for computers. But we’ve just gotten a bit closer -- we’ve developed a machine-learning system that can automatically produce captions (like the three above) to accurately describe images the first time it sees them. This kind of system could eventually help visually impaired people understand pictures, provide alternate text for images in parts of the world where mobile connections are slow, and make it easier for everyone to search on Google for images.

Recent research has greatly improved object detection, classification, and labeling. But accurately describing a complex scene requires a deeper representation of what’s going on in the scene, capturing how the various objects relate to one another and translating it all into natural-sounding language.

Many efforts to construct computer-generated natural descriptions of images propose combining current state-of-the-art techniques in both computer vision and natural language processing to form a complete image description approach. But what if we instead merged recent computer vision and language models into a single jointly trained system, taking an image and directly producing a human-readable sequence of words to describe it?

This idea comes from recent advances in machine translation between languages, where a Recurrent Neural Network (RNN) transforms, say, a French sentence into a vector representation, and a second RNN uses that vector representation to generate a target sentence in German.

Now, what if we replaced that first RNN and its input words with a deep Convolutional Neural Network (CNN) trained to classify objects in images? Normally, the CNN’s last layer is used in a final Softmax among known classes of objects, assigning a probability that each object might be in the image. But if we remove that final layer, we can instead feed the CNN’s rich encoding of the image into an RNN designed to produce phrases. We can then train the whole system directly on images and their captions, so it maximizes the likelihood that the descriptions it produces best match the training descriptions for each image.
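The data flow described above can be sketched in a few dozen lines. The toy below is purely illustrative: the vocabulary, dimensions, and random "weights" are invented stand-ins (nothing from Google's actual system), so it emits gibberish — but it shows how an image embedding seeds a recurrent state that greedily emits one word at a time:

```python
import math
import random

random.seed(0)

VOCAB = ["<start>", "<end>", "a", "dog", "on", "grass"]  # hypothetical vocabulary
DIM = 8                                                  # toy embedding size

def rand_matrix(rows, cols):
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

# Stand-ins for trained parameters: recurrent weights, word embeddings,
# and the state-to-vocabulary output projection.
W_state = rand_matrix(DIM, DIM)
W_embed = rand_matrix(DIM, len(VOCAB))
W_out = rand_matrix(len(VOCAB), DIM)

def step(state, word_id):
    # One recurrent step: mix the previous state with the last word's embedding.
    embed = [W_embed[i][word_id] for i in range(DIM)]
    return [math.tanh(s + e) for s, e in zip(matvec(W_state, state), embed)]

def caption(image_embedding, max_len=5):
    state = image_embedding          # the CNN's final encoding seeds the RNN
    word = VOCAB.index("<start>")
    words = []
    for _ in range(max_len):
        state = step(state, word)
        logits = matvec(W_out, state)
        word = max(range(len(VOCAB)), key=lambda i: logits[i])  # greedy decode
        if VOCAB[word] == "<end>":
            break
        words.append(VOCAB[word])
    return " ".join(words)

fake_image = [random.uniform(-1, 1) for _ in range(DIM)]
print(caption(fake_image))  # gibberish until trained, but the plumbing is visible
```

In the real system the random matrices are replaced by a trained CNN and an LSTM, and training adjusts all of them jointly to maximize the likelihood of the human-written captions.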


Study Shows That Bed Bugs Can Transmit Parasite that Causes Chagas Disease


The bed bug may be just as dangerous as its sinister cousin, the triatomine, or “kissing” bug. A new study from Penn Medicine researchers in the Center for Clinical Epidemiology and Biostatistics demonstrated that bed bugs, like the triatomines, can transmit Trypanosoma cruzi, the parasite that causes Chagas disease, one of the most prevalent and deadly diseases in the Americas.

The role of the bloodsucking triatomine bugs as vectors of Chagas disease—which affects 6 to 8 million people worldwide, mostly in Latin America, and kills about 50,000 a year—has long been recognized. The insects infect people not through their bite but through their feces, which they deposit on their sleeping host, often around the face, after feeding. Bed bugs, on the other hand, are usually considered disease-free nuisances whose victims are left with only itchy welts from bites and sleepless nights.

In a study published online this week in the American Journal of Tropical Medicine and Hygiene, senior author Michael Z. Levy, PhD, assistant professor in the department of Biostatistics and Epidemiology at the University of Pennsylvania’s Perelman School of Medicine, and researchers at the Universidad Peruana Cayetano Heredia in Peru conducted a series of laboratory experiments that demonstrated bi-directional transmission of T. cruzi between mice and bed bugs.

In the first experiment run at the Zoonotic Disease Research Center in Arequipa, Peru, the researchers exposed 10 mice infected with the parasite to 20 uninfected bed bugs every three days for a month. Of about 2,000 bed bugs used in the experiment, the majority acquired T. cruzi after feeding on the mice.  In a separate experiment to test transmission from bug to mouse, they found that 9 out of 12 (75 percent) uninfected mice acquired the parasite after each one lived for 30 days with 20 infected bed bugs. 
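Under the simplifying assumption (mine, not the study's) that each mouse faces a constant, independent daily risk of infection over the 30-day cohabitation, the 75 percent figure implies a rough per-day transmission hazard:

```python
# If a constant daily infection probability p yields a 75% cumulative
# infection rate after 30 days, then 1 - (1 - p)**30 = 0.75.
observed = 9 / 12   # infected mice / total mice, from the study
days = 30

p_daily = 1 - (1 - observed) ** (1 / days)
print(round(p_daily, 3))  # ~0.045: roughly a 4-5% daily risk per mouse
```

This is only a back-of-envelope rate; the actual risk would depend on bug density, feeding frequency, and fecal contact, none of which this calculation resolves.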


Major brain pathway rediscovered


A team of neuroscientists in America say they have rediscovered an important neural pathway that was first described in the late nineteenth century but then mysteriously disappeared from the scientific literature until very recently. In a study published today in Proceedings of the National Academy of Sciences, they confirm that the prominent white matter tract is present in the human brain, and argue that it plays an important and unique role in the processing of visual information. The vertical occipital fasciculus (VOF) is a large flat bundle of nerve fibres that forms long-range connections between sub-regions of the visual system at the back of the brain. It was originally discovered by the German neurologist Carl Wernicke, who had by then published his classic studies of stroke patients with language deficits, and was studying neuroanatomy in Theodor Meynert’s laboratory at the University of Vienna. Wernicke saw the VOF in slices of monkey brain, and included it in his 1881 brain atlas, naming it the senkrechte occipitalbündel, or ‘vertical occipital bundle’.

Meynert - himself a pioneering neuroanatomist and psychiatrist, whose other students included Sigmund Freud and Sergei Korsakov - refused to accept Wernicke’s discovery, however. He had already described the brain’s white matter tracts, and had arrived at the general principle that they are oriented horizontally, running mostly from front to back within each hemisphere. But the pathway Wernicke had described ran vertically. Another of Meynert’s students, Heinrich Obersteiner, identified the VOF in the human brain, and mentioned it in his 1888 textbook, calling it the senkrechte occipitalbündel in one illustration, and the fasciculus occipitalis perpendicularis in another. So, too, did Heinrich Sachs, a student of Wernicke’s, who labeled it the stratum profundum convexitatis in his 1892 white matter atlas.

The VOF appeared again in a number of other textbooks in the following decades, including the 1918 edition of Gray’s Anatomy, but eventually fell into obscurity. This may have been due to early confusion over the nomenclature; to Meynert, who remained influential but refused to acknowledge Wernicke’s discovery up until his death in 1892; and to changes in neuroanatomical methods, which gradually moved from brain dissections that exposed the white matter tracts in large part, to brain tissue slices, which did not. Jason Yeatman and his colleagues at Stanford University came across the VOF by chance several years ago. They have been visualizing the brain’s long-range connections using state-of-the-art neuroimaging techniques, in order to investigate the neural mechanisms underlying language processing and reading, and in 2012, reported that the growth pattern of the white matter tracts predicts how a child’s reading skills will develop over time.

“I stumbled upon it while studying the visual word form area,” says Yeatman. “In every subject, I found this large, vertically-oriented fibre bundle terminating in that region of the brain.” He searched for it in the literature, but found no mention of it, so his then Ph.D. supervisor sent the scans to colleagues in the neurosurgery department.


The last chapter in Rosetta's Philae lander story, at least for now


Last week was exciting and exhausting for anyone involved in space exploration and astronomy, after scientists working on the Rosetta mission of the European Space Agency (ESA) made history when their “Philae” module touched down safely on the surface of comet 67P/Churyumov–Gerasimenko. But soon after celebrating Philae’s successful landing, a dramatic story unfolded. With a bumpy triple landing, harpoons that did not fire and tether the probe, as well as a final resting-spot that lay in the shadows, which meant its solar panels received very little sunlight, Philae’s tumultuous story captivated the interest of thousands of people across the globe.

In the early hours of Saturday morning, as Philae’s batteries slowly drained of power, thousands mourned. “So much hard work..getting tired… my battery voltage is approaching the limit soon now,” tweeted the Philae crew, and yet, the lander’s story was ultimately happy and successful. Although it spent only 57 “active” hours on the comet, ESA mission scientists were happy to report that the lander indeed completed the entirety of its primary science mission.

According to a Rosetta mission update, the lander “returned all of its housekeeping data, as well as science data from the targeted instruments, including ROLIS, COSAC, Ptolemy, SD2 and CONSERT. This completed the measurements planned for the final block of experiments on the surface.” This achievement means that the relatively lightweight Philae lander actually managed to drill into the comet using the SD2 instrument, despite not being screwed in to the surface, and retrieve a sample – a formidable and risky task. The Rosetta team also managed to move the lander – lifting its body by about 4 cm and rotating it by about 35° to try to get some more solar energy – although as the last science data fed back to Earth, Philae’s power rapidly depleted.

“It has been a huge success, the whole team is delighted,” said Stephan Ulamec, lander manager at the DLR German Aerospace Agency.  “Despite the unplanned series of three touchdowns, all of our instruments could be operated and now it’s time to see what we’ve got.” Members of the Rosetta team now have heaps of science data to trawl through for new findings, and they are still searching through the many high-resolution images from the orbiter for Philae’s still unknown final resting spot.  While descent images show that the surface of the comet is covered by dust and debris ranging in size from millimetres to metres, panoramic images taken by Philae tell a slightly different tale – you can see layered walls of harder-looking material.  “We still hope that at a later stage of the mission, perhaps when we are nearer to the Sun, that we might have enough solar illumination to wake up the lander and re-establish communication,” added Ulamec.

One of the interesting bits of science to have already emerged from Philae’s experiments came from the Multi-Purpose Sensors for Surface and Subsurface Science (MUPUS) instrument that was activated on Friday. MUPUS has a hammer of sorts that was intended to smash through 67P’s surface and look at what lies beneath. Interestingly, however, the device kept hammering away at the surface with all its might for nearly seven minutes to no avail – despite being ramped up by the scientists to what they describe as “a secret power setting 4, nicknamed ‘desperate mode’”. In fact, the hammer eventually gave up and failed.

This observation is intriguing as everything we thought we knew about the comet suggested that its surface was not quite that hard. Indeed, the MUPUS team jokingly tweeted that “MUPUS performed beautifully inside the specifications. The comet failed to cooperate.”

Philae is now “hibernating” and will continue to do so until more sunlight falls on its solar panels, which may happen in August next year as the comet approaches the Sun and becomes much more active. Until then, the main Rosetta orbiter will remain in orbit around the comet and continue its mission to study the body in detail as the comet becomes more active. “At the end of this amazing rollercoaster week, we look back on a successful first-ever soft-landing on a comet. This was a truly historic moment for ESA and its partners,” said Fred Jansen, Rosetta mission manager. “We now look forward to many more months of exciting Rosetta science and possibly a return of Philae from hibernation at some point in time.”

While Philae snoozes on the alien world that is comet 67P, I can’t help but notice just how attached people have become to the lander over the last week. Maybe it was fuelled by the rather touching Tweets that Philae and the Rosetta mission sent out, or the sheer challenges involved. But I and many others have anthropomorphised the lander and, for three days, thousands of people worried about the outcome of a scientific experiment nearly as much as the scientists themselves did. We fretted when the scientists in the control room looked worried, we cheered when they received a signal from 511 million kilometres away and we sent little messages of luck and hope to a washing-machine-sized box that set off on a journey 10 years ago. As Jonathan Freedland, writing for The Guardian, said: “If Philae expires on the hard, rocky surface of Comet 67P the sadness will be felt far beyond mission control in Darmstadt, Germany.”


In 2014 global ocean warming is mostly due to the North Pacific, which has warmed far beyond any recorded value


"This summer has seen the highest global mean sea surface temperatures ever recorded since their systematic measuring started. Temperatures even exceed those of the record-breaking 1998 El Niño year," says Axel Timmermann, climate scientist and professor, studying variability of the global climate system at the International Pacific Research Center, University of Hawaii at Manoa.

From 2000-2013 the global ocean surface temperature rise paused, in spite of increasing greenhouse gas concentrations. This period, referred to as the Global Warming Hiatus, raised a lot of public and scientific interest. However, as of April 2014 ocean warming has picked up speed again, according to Timmermann's analysis of ocean temperature datasets.

"The 2014 global ocean warming is mostly due to the North Pacific, which has warmed far beyond any recorded value (Figure 1a) and has shifted hurricane tracks, weakened trade winds, and produced coral bleaching in the Hawaiian Islands," explains Timmermann.

He describes the events leading up to this upswing as follows: Sea-surface temperatures started to rise unusually quickly in the extratropical North Pacific as early as January 2014. A few months later, in April and May, westerly winds pushed a huge amount of very warm water, usually stored in the western Pacific, along the equator to the eastern Pacific. This warm water has spread along the North American Pacific coast, releasing into the atmosphere enormous amounts of heat—heat that had been locked up in the Western tropical Pacific for nearly a decade.

"Record-breaking greenhouse gas concentrations and anomalously weak North Pacific summer trade winds, which usually cool the ocean surface, have contributed further to the rise in sea surface temperatures. The warm temperatures now extend in a wide swath from just north of Papua New Guinea to the Gulf of Alaska," says Timmermann.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Advances in electron microscopy reveal secrets of HIV and other viruses

Advances in electron microscopy reveal secrets of HIV and other viruses | Amazing Science |

UC Davis researchers are getting a new look at the workings of HIV and other viruses thanks to new techniques in electron microscopy developed on campus. The envelope (or Env) protein of HIV is a prime target for vaccine makers: it is a key component of RV144, an experimental vaccine that is so far the only candidate to show promise in clinical trials. Also called gp120, the Env protein associates with another protein called gp41, and three gp120/gp41 units combine to form the final trimeric structure. The gp120 trimer is the machine that allows HIV to enter and attack host cells.

Professor R. Holland Cheng’s laboratory at UC Davis has previously shown how the gp120 trimer can change its conformation like an opening flower. The new study, published in Nature Scientific Reports Nov. 14, shows that a variable loop, V2, is located at the bottom of the trimer where it helps to hold gp41 in place — and not at the top of the structure, as previously thought.

New visualization of the V2 variable loop of the HIV Env protein (red) puts it at the bottom of the structure. “This challenges the existing dogma concerning the architecture of HIV Env immunogen,” Cheng said.

Making a vaccine against HIV has always been difficult, at least partly because the proteins on the surface of the virus change so rapidly. Better understanding the structure of the gp120/Env trimer could help in finding less-variable areas of these proteins, not usually exposed to the immune system, which might be targets for a vaccine. 

A second pair of back-to-back papers from Cheng’s lab uses new techniques in electron microscopy to probe how some common viruses hijack normal cellular processes to enter cells. Cheng’s lab has pioneered techniques in cryoelectron microscopy. Traditionally electron microscopy has relied on coating or impregnating samples with heavy metal elements. Cryoelectron microscopy uses extremely low temperatures to freeze biological structures in place instead.

By taking multiple images from slightly different angles and reconstructing them with computers, Cheng has been able to produce three-dimensional images of viruses and virus proteins and particularly, virus-infected cells. However, because of the way electrons are scattered from samples, cryoelectron microscopes can only use a limited range of angles, creating a “missing wedge” in imaging infected cells. In one of the papers recently published in the journal PLOS One, Lassi Paavolainen and colleagues present a new statistical technique to reconstruct this missing data with no prior knowledge of the sample.

In the companion paper, Pan Soonsawad and colleagues applied the new technique to study the vesicles, or small bubbles, that form inside cells when a picornavirus enters. The picornaviruses are a large group that includes the viruses that cause colds, gut infections, polio, hepatitis A, and the recent outbreaks of contagious hand, foot and mouth disease (HFMD) among infants and young children in the U.S. this summer.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

IBM developing 150-petaflops supercomputer for national labs

IBM developing 150-petaflops supercomputer for national labs | Amazing Science |

IBM recently announced that the U.S. Department of Energy has awarded IBM contracts valued at $325 million to develop and deliver “the world’s most advanced ‘data-centric’ supercomputing systems” at Lawrence Livermore and Oak Ridge National Laboratories, to advance innovation and discovery in science, engineering and national security.

The world is generating more than 2.5 billion gigabytes of “big data” every day, according to IBM’s 2013 annual report, requiring entirely new approaches to supercomputing. Repeatedly moving data back and forth between storage and processor is unsustainable under the onslaught of big data, IBM says, because of the time and energy that such massive, frequent data movement entails. The traditional emphasis on ever-faster microprocessors thus becomes progressively more untenable, since the computing infrastructure is dominated by data movement and data management.

To address this issue, for the past five years IBM researchers have pioneered a new “data centric” approach — an architecture that embeds compute power everywhere data resides in the system, allowing for a convergence of analytics, modeling, visualization, and simulation, and driving new insights at “incredible” speeds.

IBM says the two Laboratories anticipate that the new IBM OpenPOWER-based supercomputers will be among the “fastest and most energy-efficient” systems, thanks to this data-centric approach. The systems at each laboratory are expected to offer five to 10 times better performance on commercial and high-performance computing applications compared to the current systems at the labs, and will be more than five times more energy efficient.

The “Sierra” supercomputer at Lawrence Livermore and “Summit” supercomputer at Oak Ridge will each have a peak performance of more than 150 petaflops (compared to today’s fastest supercomputer, China’s Tianhe-2, with 33.86 petaflops) with more than five petabytes of dynamic and flash memory to help accelerate the performance of data-centric applications. The systems will also be capable of moving data to the processor, when necessary, at more than 17 petabytes per second (which is equivalent to moving over 100 billion photos on Facebook in a second) to speed time to insights.
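The comparisons quoted above can be checked with quick back-of-the-envelope arithmetic. The sketch below is illustrative only; the implied per-photo size follows from the article's own figures and is not a number stated by IBM or Facebook.

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).

PFLOPS = 1e15   # floating-point operations per second in one petaflop
PB = 1e15       # bytes in one petabyte (decimal convention)

# Peak performance: 150+ petaflops vs. Tianhe-2's 33.86 petaflops.
speedup = (150 * PFLOPS) / (33.86 * PFLOPS)
print(f"Peak speedup over Tianhe-2: {speedup:.1f}x")  # about 4.4x

# Data movement: 17 PB/s, said to equal 100 billion Facebook photos per second.
bandwidth = 17 * PB    # bytes per second
photos = 100e9         # 100 billion photos
implied_photo_size = bandwidth / photos
print(f"Implied average photo size: {implied_photo_size / 1e3:.0f} kB")  # 170 kB
```

So a "150-petaflops" machine would be roughly four and a half times faster at peak than Tianhe-2, and the photo comparison implicitly assumes an average photo of about 170 kB.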

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Of the difference between mouse and man: A comparative encyclopedia of DNA elements in the mouse genome

Of the difference between mouse and man: A comparative encyclopedia of DNA elements in the mouse genome | Amazing Science |

The laboratory mouse shares the majority of its protein-coding genes with humans, making it the premier model organism in biomedical research, yet the two mammals differ in significant ways. To gain greater insights into both shared and species-specific transcriptional and cellular regulatory programs in the mouse, the Mouse ENCODE Consortium has mapped transcription, DNase I hypersensitivity, transcription factor binding, chromatin modifications and replication domains throughout the mouse genome in diverse cell and tissue types. By comparing with the human genome, the scientists could not only confirm substantial conservation in the newly annotated potential functional sequences, but also find a large degree of divergence of sequences involved in transcriptional regulation, chromatin state and higher order chromatin organization. These results illuminate the wide range of evolutionary forces acting on genes and their regulatory regions, and provide a general resource for research into mammalian biology and mechanisms of human diseases.

Advances in DNA sequencing technologies have led to the development of RNA-seq (RNA sequencing), DNase-seq (DNase I hypersensitive sites sequencing), ChIP-seq (chromatin immunoprecipitation followed by DNA sequencing), and other methods that allow rapid and genome-wide analysis of transcription, replication, chromatin accessibility, chromatin modifications and transcription factor binding in cells. Using these large-scale approaches, the ENCODE consortium has produced a catalog of potential functional elements in the human genome. Notably, 62% of the human genome is transcribed in one or more cell types, and 20% of human DNA is associated with biochemical signatures typical of functional elements, including transcription factor binding, chromatin modification and DNase hypersensitivity. The results support the notion that nucleotides outside the mammalian-conserved genomic regions could contribute to species-specific traits.

Now, teams of scientists have applied the same high-throughput approaches to over 100 mouse cell types and tissues, producing a coordinated group of data sets for annotating the mouse genome. Integrative analyses of these data sets uncovered widespread transcriptional activities, dynamic gene expression and chromatin modification patterns, abundant cis-regulatory elements, and remarkably stable chromosome domains in the mouse genome. The generation of these data sets also allowed an unprecedented level of comparison of genomic features of mouse and human. Described in the current article and companion works, these comparisons reveal both conserved sequence features and widespread divergence in transcription and regulation. Some of the key findings are:

  • Although much conservation exists, the expression profiles of many mouse genes involved in distinct biological pathways show considerable divergence from their human orthologues.
  • A large portion of the cis-regulatory landscape has diverged between mouse and human, although the magnitude of regulatory DNA divergence varies widely between different classes of elements active in different tissue contexts.
  • Mouse and human transcription factor networks are substantially more conserved than cis-regulatory DNA.
  • Species-specific candidate regulatory sequences are significantly enriched for particular classes of repetitive DNA elements.
  • Chromatin state landscape in a cell lineage is relatively stable in both human and mouse.
  • Chromatin domains, interrogated through genome-wide analysis of DNA replication timing, are developmentally stable and evolutionarily conserved.

To annotate potential functional sequences in the mouse genome, the scientists used ChIP-seq, RNA-seq and DNase-seq to profile transcription factor binding, chromatin modification, the transcriptome and chromatin accessibility in a collection of 123 mouse cell types and primary tissues (Supplementary Tables). Additionally, to interrogate large-scale chromatin organization across different cell types, they also used a microarray-based technique to generate replication-timing profiles in 18 mouse tissues and cell types (Supplementary Table). Altogether, they produced over 1,000 data sets. The list of the data sets and all the supporting material for this manuscript are also available at the website of the project:

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Researchers Provide First Peek at How Neurons Multitask

Researchers Provide First Peek at How Neurons Multitask | Amazing Science |

Researchers at the University of Michigan have shown how a single neuron can perform multiple functions in a model organism, illuminating for the first time this fundamental biological mechanism and shedding light on the human brain.

Investigators in the lab of Shawn Xu at the Life Sciences Institute found that a neuron in C. elegans, a tiny worm with a simple nervous system used as a model for studying sensation, movement and other neurological function, regulates both the speed and direction in which the worm moves. The individual neurons can route information through multiple downstream neural circuits, with each circuit controlling a specific behavioral output.

The findings are scheduled for online publication in the journal Cell on Nov. 6. The research is also featured on the cover.

"Understanding how the nervous system and genes lead to behavior is a fundamental question in neuroscience, and we wanted to figure out how C. elegans are able to perform a wide range of complex behaviors with their small nervous systems," Xu said.

The C. elegans nervous system contains 302 neurons.

"Scientists think that even though humans have billions of neurons, some perform multiple functions. Seeing the mechanism in worms will help to understand the human brain," Xu said.

The model neuron studied, AIY, regulates at least two distinct motor outputs: locomotion speed and direction-switch. AIY interacts with two circuits, one that is inhibitory and controls changes in the direction of the worm's movement, and a second that is excitatory and controls speed.

"It's important to note that these two circuits have connections with other neurons and may cross-talk with each other," Xu said. "Neuronal control of behavior is very complex."

Xu is a faculty member in the U-M Life Sciences Institute, where his laboratory is located and research conducted. He is also a professor of molecular and integrative physiology at the U-M Medical School.

Other authors on the paper were Zhaoyu Li, Jie Liu and Maohua Zheng, also of the Life Sciences Institute and Department of Molecular and Integrative Physiology in the U-M Medical School.
The research was supported by the National Institutes of Health.
Shawn Xu: 

No comment yet.
Scooped by Dr. Stefan Gruenwald!

A Man Going Deaf Can Hear Wi-Fi Signals

A Man Going Deaf Can Hear Wi-Fi Signals | Amazing Science |

If you ever hooked up to the Internet before the 2000s, you’ll probably remember that ear-piercing screech emitted by the dial-up modem. These days, the only noise you’ll hear will be the tapping of keys as you punch in the passcode. But not for Frank Swain, the man who can hear Wi-Fi wherever he goes. No, he doesn’t have a rare genetic mutation, but he does have souped-up hearing aids and some very clever software.

Swain has been losing his hearing since his 20s and was fitted with hearing aids two years ago. But his interest did not lie in recreating the soundscape that was gradually fading; he wanted to be able to listen to something that we can’t hear: wireless communication.

To achieve this, science writer Swain buddied up with sound artist Daniel Jones. Using a grant from a UK innovation charity, the duo eventually built Phantom Terrains, a tool that makes Wi-Fi audible. The software, which runs on a hacked iPhone, works by tuning into wireless communication fields. Using the inbuilt Wi-Fi sensor, the software is able to pick up details such as router name, encryption modes and distance from the device.

“The strength of the signal, direction, name and security level on these are translated into an audio stream made up of a foreground and background layer: distant signals click and pop like hits on a Geiger counter, while the strongest bleat their network ID in a looped melody,” Swain writes in an essay in New Scientist. “The audio is streamed constantly to a pair of hearing aids. The extra sound layer is blended with the normal output of the hearing aids; it simply becomes part of my soundscape. So long as I carry my phone with me, I will always be able to hear Wi-Fi.”
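The mapping Swain describes can be sketched roughly as follows. This is not the actual Phantom Terrains code; the scan results, thresholds and note mapping below are hypothetical choices made purely for illustration.

```python
# Illustrative sketch of the signal-to-sound mapping described above.
# NOT the Phantom Terrains software; all constants are hypothetical.

def clicks_per_second(rssi_dbm, floor=-90, ceil=-30, max_rate=40.0):
    """Map a Wi-Fi signal strength (dBm) to a Geiger-counter-like click rate."""
    frac = (rssi_dbm - floor) / (ceil - floor)
    frac = min(1.0, max(0.0, frac))  # clamp to [0, 1]
    return frac * max_rate

def ssid_melody(ssid):
    """Turn a network name into a looped note sequence (MIDI note numbers)."""
    base = 60  # middle C
    return [base + (ord(ch) % 12) for ch in ssid]

# Hypothetical scan results: (SSID, signal strength in dBm)
scan = [("CoffeeShop", -45), ("Home-5G", -70), ("FarAway", -88)]
for ssid, rssi in scan:
    rate = clicks_per_second(rssi)
    print(f"{ssid}: {rate:.1f} clicks/s, melody {ssid_melody(ssid)[:4]}...")
```

Distant networks would then tick quietly like background radiation, while a strong nearby router repeats its name as a melody, matching the foreground/background layering Swain describes.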

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Scientists X-ray nanoscale organelles of bacteria

Scientists X-ray nanoscale organelles of bacteria | Amazing Science |

An international team of scientists led by Uppsala University has developed a high-throughput method of imaging biological particles using an X-ray laser. Images obtained with this method show projections of the carboxysome particle, a delicate and tiny cell organelle of photosynthetic bacteria. Organelles are tiny structures within the cell responsible for performing specific functions, much as organs within the body are responsible for performing certain functions.

This experiment, described in a paper published in the scientific journal Nature Photonics ("High-throughput imaging of heterogeneous cell organelles with an X-ray laser"), represents a major milestone for studies of individual biological structures using X-ray lasers like the European XFEL that is currently being built from the DESY campus in Hamburg to the neighboring town of Schenefeld. "The technique paves the way for 3D imaging of parts of the cell, and even small viruses, to develop a deeper understanding of life’s machinery", says Uppsala University professor Janos Hajdu, who is one of the lead authors on the paper and an advisor to European XFEL.

To test the method, scientists from Uppsala University, the European XFEL, DESY and a number of other institutions studied the carboxysome, the cell organelle for carbon dioxide assimilation in cyanobacteria. Carboxysomes are responsible for about a third of global carbon fixation. The carboxysome contains protein machinery that incorporates carbon from carbon dioxide into biomolecules and has been studied extensively in Uppsala by Dirk Hasse and Inger Andersson. The carboxysome is a tiny icosahedral structure (one with 20 triangle-shaped faces) about 100 nanometers in diameter, too small to see clearly with an optical microscope.

Using a specially designed injector that produces a particle stream smaller than the width of a hair, the scientists sprayed an aerosol of carboxysomes across the beam of the LCLS X-ray laser at the SLAC National Accelerator Laboratory in the US.

“The structure of the organelles is determined from the way in which individual carboxysomes scatter the extremely short and ultra bright X-ray flashes of the LCLS”, says DESY scientist Anton Barty, one of the authors of the paper. Uniquely, this new method does not require crystals to get sufficient signal. “Thanks to the extreme brightness of the X-ray laser, which provides X-ray pulses of short enough duration to capture information before the sample explodes, we can reconstruct individual samples without having to crystallize the sample.”

No comment yet.
Scooped by Dr. Stefan Gruenwald!

CERN: LHCb experiment observes two new baryon particles never seen before

CERN: LHCb experiment observes two new baryon particles never seen before | Amazing Science |

Today the collaboration for the LHCb experiment at CERN’s Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b'- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles each contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called "spin". In the Xi_b'- state, the spins of the two lighter quarks point in opposite directions, whereas in the Xi_b*- state they are aligned. This difference makes the Xi_b*- a little heavier.

"Nature was kind and gave us two particles for the price of one," said Matthew Charles of the CNRS's LPNHE laboratory at Paris VI University. "The Xi_b'- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn't have seen it at all using the decay signature that we were looking for.”

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Nanoparticles that enable both MRI and fluorescent imaging could monitor cancer

Nanoparticles that enable both MRI and fluorescent imaging could monitor cancer | Amazing Science |

MIT chemists have developed new nanoparticles that can simultaneously perform magnetic resonance imaging (MRI) and fluorescent imaging in living animals. Such particles could help scientists to track specific molecules produced in the body, monitor a tumor's environment, or determine whether drugs have successfully reached their targets.

In a paper appearing in the Nov. 18 issue of Nature Communications, the researchers demonstrate the use of the particles, which carry distinct sensors for fluorescence and MRI, to track vitamin C in mice. Wherever there is a high concentration of vitamin C, the particles show a strong fluorescent signal but little MRI contrast. If there is not much vitamin C, a stronger MRI signal is visible but fluorescence is very weak. Future versions of the particles could be designed to detect reactive oxygen species that often correlate with disease, says Jeremiah Johnson, an assistant professor of chemistry at MIT and senior author of the study. They could also be tailored to detect more than one molecule at a time.
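The reciprocal readout described above can be captured in a toy model: the more vitamin C the particle encounters, the weaker the MRI contrast and the brighter the fluorescence. The linear relationship and the values below are illustrative assumptions, not the paper's actual calibration.

```python
# Toy model of the reciprocal readout described above (hypothetical numbers):
# active nitroxides provide MRI contrast and quench Cy5.5 fluorescence;
# vitamin C deactivates them, dimming MRI and relieving the quenching.

def readout(vitamin_c_fraction):
    """vitamin_c_fraction: fraction of nitroxides deactivated, in [0, 1].
    Returns (mri_signal, fluorescence), each normalized to [0, 1]."""
    active = 1.0 - vitamin_c_fraction
    mri_signal = active           # contrast scales with active nitroxide
    fluorescence = 1.0 - active   # quenching relieved as nitroxides switch off
    return mri_signal, fluorescence

print(readout(0.0))   # no vitamin C: strong MRI, dark fluorescence
print(readout(0.75))  # abundant vitamin C: weak MRI, bright fluorescence
```

The complementary pair of signals is what makes the probe informative: either reading alone could be confounded by particle concentration, but the ratio of the two tracks the analyte.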

Johnson and his colleagues designed the particles so they can be assembled from building blocks made of polymer chains carrying either an organic MRI contrast agent called a nitroxide or a fluorescent molecule called Cy5.5. When mixed together in a desired ratio, these building blocks join to form a specific nanosized structure the authors call a branched bottlebrush polymer. For this study, they created particles in which 99 percent of the chains carry nitroxides, and 1 percent carry Cy5.5.

Nitroxides are reactive molecules that contain a nitrogen atom bound to an oxygen atom with an unpaired electron. Nitroxides suppress Cy5.5's fluorescence, but when the nitroxides encounter a molecule such as vitamin C from which they can grab electrons, they become inactive and Cy5.5 fluoresces.

Nitroxides typically have a very short half-life in living systems, but University of Nebraska chemistry professor Andrzej Rajca, who is also an author of the new Nature Communications paper, recently discovered that their half-life can be extended by attaching two bulky structures to them. Furthermore, the authors of the Nature Communications paper show that incorporation of Rajca's nitroxide in Johnson's branched bottlebrush polymer architectures leads to even greater improvements in the nitroxide lifetime. With these modifications, nitroxides can circulate for several hours in a mouse's bloodstream—long enough to obtain useful MRI images.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Devastating Starfish Disease Seems To Be Caused by Waterborne Densovirus

Devastating Starfish Disease Seems To Be Caused by Waterborne Densovirus | Amazing Science |

The mystery surrounding a gruesome disease that affects starfish on the Pacific coast might finally be put to rest. Sea star wasting syndrome is a real disease that is killing off one of the sea's most iconic invertebrates. While the disease has affected starfish (also known as sea stars) for decades, scientists have long puzzled over what might be causing it. Now, one group of researchers may finally have the answer.

The disease is most likely caused by a virus, according to the researchers, who represent a number of institutions, including Cornell University and the University of California, Santa Cruz. Specifically, the scientists have linked the disease to a densovirus (Parvoviridae), which currently affects at least 20 species of starfish on the Pacific coast of North America.

Starfish wasting disease was first identified in 1979, but since then, no one has been able to pin down a precise cause, according to Pete Raimondi, a professor of ecology and evolutionary biology at UC Santa Cruz and co-author of the new sea star study. Scientists long believed that outbreaks of the disease — which occurred in 1983, 1998 and most recently starting in 2013 — may be linked to environmental stressors, such as spikes in ocean temperature or pollution from shipping lanes and marinas. But while such stressors may have something to do with the rapid spread of sea star wasting syndrome, the researchers now think the underlying cause of the disease is the waterborne densovirus.

"What convinced me that this was an infectious agent was that sea stars that had been in captivity in public aquariums for 30 years suddenly died," said Ian Hewson, an associate professor of microbiology at Cornell and lead author of the study. "There was good evidence that it was something coming in through the intake for the aquariums that wasn't being removed by the sand filtration. And [aquariums] receiving UV-treated water weren't getting sick."

To test this hypothesis, Hewson and his team used a process known as metagenomics, in which genetic material is collected directly from environmental samples and then sequenced in a lab. The researchers collected tissue samples from both healthy starfish and those affected by the wasting disease. They then extracted DNA from these samples and tried to figure out how the healthy tissue differed from the infected tissue. The difference between the two kinds of samples soon became clear: the infected tissue contained a densovirus, Hewson said.

With sea stars in hand, the researchers determined which of the animals were infected with the virus. They then measured how much of the virus was present in the animal's tissue per unit of weight, a measurement known as viral load. Ultimately, they found a significant association between the presence of the disease and the viral load in the tissue, according to Hewson. The researchers believe this association supports their hypothesis that the wasting disease is caused by the sea-star-associated densovirus.
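The kind of comparison described here can be sketched with invented numbers; the actual study used real viral-load measurements and more rigorous statistics than this illustration.

```python
# Illustrative comparison of viral load (virus copies per unit tissue weight)
# between healthy and symptomatic sea stars. All numbers are invented.

import statistics

healthy_loads = [0.0, 0.1, 0.0, 0.3, 0.2]   # hypothetical copies per mg
diseased_loads = [4.1, 6.7, 5.2, 8.0, 3.9]

mean_h = statistics.mean(healthy_loads)
mean_d = statistics.mean(diseased_loads)
print(f"mean viral load, healthy:  {mean_h:.2f}")
print(f"mean viral load, diseased: {mean_d:.2f}")

# Crude effect size: difference of means over the combined-sample SD.
sd = statistics.stdev(healthy_loads + diseased_loads)
effect_size = (mean_d - mean_h) / sd
print(f"approximate effect size: {effect_size:.1f}")
```

A large, consistent separation between the two groups is what "significant association" amounts to here: diseased tissue carries orders of magnitude more virus than healthy tissue.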

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Astronomers Find Evidence Of Two Undiscovered Trans-Plutonian Planets In Our Solar System

Astronomers Find Evidence Of Two Undiscovered Trans-Plutonian Planets In Our Solar System | Amazing Science |

The possibility of a planet lurking in the outer reaches of the solar system has gained new ground, based on the orbits of recently discovered objects. There is a new twist to the latest evidence, however, with suggestions of not one but two large planets at mind-bending distances from the Sun.

The quest for a "Planet X" beyond Neptune has been going on for more than a century. Recently, two dwarf planets, Sedna and 2012 VP113, have been identified with orbits extending to distances hundreds of times further from the Sun than our own.

Distant as these orbits are, they are too close to be part of the Oort Cloud, a collection of comets that mostly orbit at distances beyond 5000 AU.

Instead it is thought that these objects formed closer to the sun. The gravitational influence of a large planet is one explanation of how their orbits changed. The theory has its own problems – if we can’t explain how objects like these came to be orbiting at such distances, then it’s equally unclear how a theoretical planet came to be there.

Scott Sheppard, of the Carnegie Institution for Science, and the Gemini Observatory's Chad Trujillo noted a clustering in the orbits of the solar system's most distant known objects, many of which they had discovered. Ten Kuiper Belt objects, along with the minor planets Sedna and 2012 VP113, all have orbits that cross the plane of the solar system at angles that range from shallow to steep. Yet all of these distant objects reach their closest point to the Sun just when they are near the plane in which the planets circle. The scientists considered this unlikely to be a coincidence, and speculate it might be a sign of a planet influencing all of their orbits.

In Monthly Notices of the Royal Astronomical Society Letters, brothers Carlos and Raul de la Fuente Marcos of Complutense University of Madrid have taken this a step further. “The analysis of several possible scenarios strongly suggest that at least two trans-Plutonian planets must exist,” they conclude.

Even more recently, Lorenzo Iorio of the Italian Ministry of Education, Universities and Research has argued in the same journal that if planet X exists, it must be much further out than Trujillo and Sheppard proposed. How far it would need to be depends on its mass, but an unknown object twice as heavy as the Earth could not be less than 500 AU from the Sun, Iorio maintains.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Spiral laser beam used to create a whirlpool of hybrid light-matter particles called polaritons

Spiral laser beam used to create a whirlpool of hybrid light-matter particles called polaritons | Amazing Science |

Physicists at Australian National University have engineered a spiral laser beam and used it to create a whirlpool of hybrid light-matter particles called polaritons.  "Creating circulating currents of polaritons – vortices – and controlling them has been a long-standing challenge," said leader of the team, theoretician Dr Elena Ostrovskaya, from the Research School of Physics and Engineering. "We can now create a circulating flow of these hybrid particles and sustain it for hours."

Polaritons are hybrid particles that have properties of both matter and light. The ability to control polariton flows in this way could aid the development of completely novel technology to link conventional electronics with new laser and fibre-based technologies.

Polaritons form in semiconductors when laser light interacts with electrons and holes (positively charged vacancies) so strongly that it is no longer possible to distinguish light from matter.

The team created the spiral beam by putting their laser through a piece of brass with a spiral pattern of holes in it. This was directed into a semiconductor microcavity, a tiny wafer of aluminium gallium arsenide, a material used in LEDs, sandwiched between two reflectors. "The vortices have previously only appeared randomly, and always in pairs that swirl in opposite directions," said Dr Robert Dall, who led the experimental part of the project. "However, by using a spiral mask to structure our laser, we create a chiral system that prefers one flow direction. Therefore we can create a single, stable vortex at will."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Google joins the effort to combat overfishing, with Global Fishing Watch

Google joins the effort to combat overfishing, with Global Fishing Watch | Amazing Science |
Google has partnered with SkyTruth and Oceana to produce a new tool to track global fishing activity. Known as Global Fishing Watch, the interactive web tool uses satellite data to provide detailed vessel tracking, and aims to harness the power of citizen engagement to tackle the issue of overfishing.

According to the United Nations Food and Agriculture Organization, more than 90 percent of the world’s fisheries are working at peak capacity, with as much as one-third of marine fish stocks now suffering from overfishing.

Though the issue is clear, the distant and out-of-sight nature of commercial fishing creates a problem when it comes to accountability. To help combat this, Google has teamed up with marine advocacy group Oceana and mapping company SkyTruth to develop Global Fishing Watch, a tool that gives anyone with an internet connection access to the timing and position of intensive fishing around the world.

Currently in the prototype stage, the tool makes use of Automatic Identification System (AIS) satellite location data – a tool initially designed to help avoid maritime collisions. The system analyses the movement pattern of each ship to determine whether it is indeed a fishing vessel, before plotting its activity on an interactive map.
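One common heuristic for this kind of movement-pattern analysis (not necessarily the classifier Global Fishing Watch itself uses) is that fishing vessels tend to move slowly and change heading often, while vessels in transit hold fast, steady courses. A minimal sketch, with hypothetical thresholds:

```python
# Illustrative heuristic for spotting fishing-like behavior in an AIS track.
# Global Fishing Watch applies more sophisticated analysis; the thresholds
# and sample tracks here are hypothetical.

def looks_like_fishing(track, max_speed_knots=5.0, min_turn_deg=30.0):
    """track: list of (speed_knots, heading_deg) AIS samples.
    Flags tracks that are mostly slow and frequently turning."""
    slow = sum(1 for speed, _ in track if speed < max_speed_knots)
    turns = sum(
        1
        for (_, h1), (_, h2) in zip(track, track[1:])
        if abs((h2 - h1 + 180) % 360 - 180) > min_turn_deg  # wrapped heading change
    )
    return slow / len(track) > 0.7 and turns / (len(track) - 1) > 0.3

transit = [(14.0, 90), (14.2, 91), (13.9, 90), (14.1, 92)]  # fast, steady course
trawl = [(3.0, 10), (2.5, 80), (3.2, 150), (2.8, 220)]      # slow, winding
print(looks_like_fishing(transit))  # False
print(looks_like_fishing(trawl))    # True
```

Plotting only the tracks that pass such a filter is, in rough outline, how raw AIS pings become a map of "intensive fishing" rather than of all shipping.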
No comment yet.
Scooped by Dr. Stefan Gruenwald!

NASA's new hi-res computer model gives scientists a stunning new look at how CO2 travels around the globe

NASA's new hi-res computer model gives scientists a stunning new look at how CO2 travels around the globe | Amazing Science |

Plumes of carbon dioxide in the simulation swirl and shift as winds disperse the greenhouse gas away from its sources. The simulation also illustrates differences in carbon dioxide levels in the northern and southern hemispheres and distinct swings in global carbon dioxide concentrations as the growth cycle of plants and trees changes with the seasons.

Scientists have made ground-based measurements of carbon dioxide for decades and in July NASA launched the Orbiting Carbon Observatory-2 (OCO-2) satellite to make global, space-based carbon observations. But the simulation - the product of a new computer model that is among the highest-resolution ever created - is the first to show in such fine detail how carbon dioxide actually moves through the atmosphere.

"While the presence of carbon dioxide has dramatic global consequences, it's fascinating to see how local emission sources and weather systems produce gradients of its concentration on a very regional scale," said Bill Putman, lead scientist on the project from NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Simulations like this, combined with data from observations, will help improve our understanding of both human emissions of carbon dioxide and natural fluxes across the globe."

The carbon dioxide visualization was produced by a computer model called GEOS-5, created by scientists at NASA Goddard's Global Modeling and Assimilation Office. In particular, the visualization is part of a simulation called a "Nature Run." The Nature Run ingests real data on atmospheric conditions and the emission of greenhouse gases and both natural and man-made particulates. The model is then left to run on its own and simulate the natural behavior of the Earth's atmosphere. This Nature Run simulates May 2005 to June 2007.
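GEOS-5 solves the full three-dimensional atmosphere, but the basic mechanism it visualizes – winds carrying a CO2 plume downstream from its source – can be cartooned with a minimal one-dimensional tracer-advection step. The grid size, wind speed, and plume shape below are invented for illustration and bear no relation to the actual model's configuration.

```python
# Minimal 1-D tracer advection on a periodic domain: a crude cartoon of how
# winds disperse a CO2 plume. All parameters are invented for illustration.

N = 100   # grid cells on a periodic (wrap-around) domain
dx = 1.0  # cell width (arbitrary units)
u = 1.0   # constant wind speed
dt = 0.5  # time step chosen so u*dt/dx <= 1 (CFL stability condition)

# Initial condition: a concentrated plume near one "emission source".
co2 = [0.0] * N
for i in range(10, 15):
    co2[i] = 1.0

def step(c):
    """One first-order upwind finite-difference advection step."""
    courant = u * dt / dx
    return [c[i] - courant * (c[i] - c[i - 1]) for i in range(len(c))]

for _ in range(40):
    co2 = step(co2)

# On the periodic domain total tracer mass is conserved while the plume
# drifts downstream and spreads out.
print(round(sum(co2), 6))
```

The conservation property falls out of the periodic upwind scheme: each step only moves tracer between neighboring cells, so the total stays fixed while the distribution shifts with the wind – the same qualitative behavior the Nature Run visualization shows at vastly higher fidelity.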

While Goddard scientists have been tweaking a "beta" version of the Nature Run internally for several years, they are now releasing this updated, improved version to the scientific community for the first time. Scientists are presenting a first look at the Nature Run and the carbon dioxide visualization at the SC14 supercomputing conference this week in New Orleans.


Elusive dark matter may be detected with GPS satellites


The everyday use of a GPS device might be to find your way around town or even navigate a hiking trail, but for two physicists, the Global Positioning System might be a tool for directly detecting and measuring dark matter – a ubiquitous but so far elusive form of matter thought to be responsible for the formation of galaxies.

Andrei Derevianko, of the University of Nevada, Reno, and his colleague Maxim Pospelov, of the University of Victoria and the Perimeter Institute for Theoretical Physics in Canada, have proposed a method for a dark-matter search with GPS satellites and other atomic clock networks that compares times from the clocks and looks for discrepancies.

"Despite solid observational evidence for the existence of dark matter, its nature remains a mystery," Derevianko, a professor in the College of Science at the University, said. "Some research programs in particle physics assume that dark matter is composed of heavy-particle-like matter. This assumption may not hold true, and significant interest exists for alternatives."

"Modern physics and cosmology fail dramatically in that they can only explain 5 percent of mass and energy in the universe in the form of ordinary matter, but the rest is a mystery." There is evidence that dark energy is about 68 percent of the mystery mass and energy. The remaining 27 percent is generally acknowledged to be dark matter, even though it is not visible and eludes direct detection and measurement.

"Our research pursues the idea that dark matter may be organized as a large gas-like collection of topological defects, or energy cracks," Derevianko said. "We propose to detect the defects, the dark matter, as they sweep through us with a network of sensitive atomic clocks. The idea is, where the clocks go out of synchronization, we would know that dark matter, the topological defect, has passed by. In fact, we envision using the GPS constellation as the largest human-built dark-matter detector."
