Amazing Science
Scooped by Dr. Stefan Gruenwald!

'Solid' light could compute previously unsolvable problems

Researchers at Princeton University have begun crystallizing light as part of an effort to answer fundamental questions about the physics of matter.

The researchers are not shining light through crystal – they are transforming light into crystal. As part of an effort to develop exotic materials such as room-temperature superconductors, the researchers have locked together photons, the basic element of light, so that they become fixed in place.

"It's something that we have never seen before," said Andrew Houck, an associate professor of electrical engineering and one of the researchers. "This is a new behavior for light."

The results raise intriguing possibilities for a variety of future materials. But the researchers also intend to use the method to address questions about the fundamental study of matter, a field called condensed matter physics.

"We are interested in exploring – and ultimately controlling and directing – the flow of energy at the atomic level," said Hakan Türeci, an assistant professor of electrical engineering and a member of the research team. "The goal is to better understand current materials and processes and to evaluate materials that we cannot yet create."

The team's findings, reported online on Sept. 8 in the journal Physical Review X, are part of an effort to answer fundamental questions about atomic behavior by creating a device that can simulate the behavior of subatomic particles. Such a tool could be invaluable for answering questions about atoms and molecules that cannot be answered even with today's most advanced computers.

In part, that is because current computers operate under the rules of classical mechanics, which is a system that describes the everyday world containing things like bowling balls and planets. But the world of atoms and photons obeys the rules of quantum mechanics, which include a number of strange and very counterintuitive features. One of these odd properties is called "entanglement" in which multiple particles become linked and can affect each other over long distances.

'Pick 'n' Mix' chemistry to grow cultures of bioactive molecules

Chemists at ETH-Zürich and ITbM, Nagoya University have developed a new method to build large libraries of bioactive molecules – which can be used directly for biological assays – by simply mixing a small number of building blocks in water.

Professor Jeffrey Bode of ETH-Zürich and the Institute of Transformative Bio-Molecules (ITbM) of Nagoya University and his co-worker have established a new strategy called "synthetic fermentation" to rapidly synthesize large numbers of bioactive molecules, which can be screened directly in biological assays, simply by mixing a few building blocks in aqueous media. Based on a highly selective amide-forming ligation, the reaction proceeds with high efficiency in the absence of organisms, enzymes or reagents, and the fermentation products can be screened for biological activity without any purification. Synthetic fermentation has enabled the generation of about 6,000 unnatural peptide-like molecules from only 23 building blocks. The practicality of the approach was demonstrated by identifying a compound that inhibits an enzyme responsible for the replication of the hepatitis C virus. The method, described in a study published online on September 7, 2014 in Nature Chemistry as an Advance Online Publication, is expected to allow rapid generation and screening of active molecules for drug discovery and other industrial applications, as well as for simple on-site biological assays.
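The scale of the library can be sanity-checked with back-of-the-envelope combinatorics. The assumption below (each product chaining exactly three blocks in a fixed order) is purely illustrative and not the paper's actual design:

```python
# Illustrative arithmetic only -- the real library design is more
# constrained than a free combination of monomers.
building_blocks = 23

# If each "fermentation product" chained three blocks in a fixed order
# (say initiator -> elongation unit -> terminator), the naive upper
# bound on distinct products would be:
naive_upper_bound = building_blocks ** 3
print(naive_upper_bound)  # 12167 -- same order of magnitude as the ~6,000 reported
```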

Layered graphene sandwich for next generation electronics

Sandwiching layers of graphene with white graphene could produce designer materials capable of creating high-frequency electronic devices, University of Manchester scientists have found.

Writing in Nature Nanotechnology, the researchers have demonstrated how combining the two-dimensional materials in a stack could create perfect crystals capable of being used in next generation transistors.

Hexagonal boron nitride (hBN), otherwise known as white graphene, is one of a family of two-dimensional materials discovered in the wake of the isolation of graphene at the University of Manchester in 2004. Manchester researchers have previously demonstrated how combining 2D materials, in stacks called heterostructures, could lead to materials capable of being designed to meet industrial demands.

Now, for the first time, the team has demonstrated that the electronic behaviour of the heterostructures can be changed enormously by precisely controlling the orientation of the crystalline layers within the stacks.

The researchers, led by University of Manchester Nobel laureate Sir Kostya Novoselov, carefully aligned two graphene electrodes separated by hBN and discovered there was a conservation of electron energy and momentum.

The findings could pave the way for devices with ultra-high frequencies, such as electronic or photovoltaic sensors.

The research was carried out with scientists from Lancaster and Nottingham Universities in the UK, and colleagues in Russia, Seoul and Japan.

Professor Laurence Eaves, a joint academic from the Universities of Manchester and Nottingham, said: "This research arises from a beautiful combination of classical laws of motion and the quantum wave nature of electrons, which enables them to flow through barriers."

Are 75% of Australia's living species still unknown?

Australia may be known for its unique plants and animals, but how many do we actually know about? Jo Harding is the manager of Bush Blitz, a program supported by federal and state government agencies and research institutions, which documents plants and animals around Australia, leading to the discovery of hundreds of new species.

Ms. Harding states: "There's estimated to be about 75 per cent of Australia's biodiversity that's largely unknown. So there's certainly a lot out there still to find. We've discovered 700 new species so far, that's over the last approximately four years, and we're still counting."

The word 'biodiversity' has a complex scientific definition, but generally speaking, it is used as a catch-all phrase for all plants, animals and other living organisms in a particular area, a spokeswoman for Bush Blitz said.

It covers all types of plants (including algae) and fungi as well as vertebrates (such as mammals, reptiles, fish and birds) and invertebrates (such as insects and octopuses) in both marine and land environments.

A recent CSIRO publication on biodiversity says the scientific definition "includes more than just organisms themselves". "Its definition includes the diversity of the genetic material within each species and the diversity of ecosystems that those species make up, as well as the ecological and evolutionary processes that keep them functioning and adapting," the publication said.

"Biodiversity is not simply a list of species, therefore. It includes the genetic and functional operations that keep the living world working, so emphasizing inter-dependence of the elements of nature."

Undescribed species are species that may have been collected before, perhaps in different areas or by different people, but that have never been formally described. It is then up to an expert to examine the specimen to confirm that it really is an undescribed species and to write a formal description. Once that description has been published, the species is known as a described species.

Ms Harding's claim that about 75 per cent of Australia's biodiversity is unknown is based on a 2009 report published by the federal environment department. It aggregates information from a large number of sources and previous studies to calculate the number of species already discovered and estimate the number of species yet to be discovered both around the world and in Australia.

It determined that Australia had 147,579 "accepted described species", 26 per cent of its estimated total Australian species.
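The report's two figures are mutually consistent, as a quick back-of-the-envelope check shows (the described count and fraction are from the text; the estimated total is derived, not quoted):

```python
described = 147_579        # "accepted described species" (2009 report)
described_fraction = 0.26  # 26 per cent of the estimated total

# Implied estimate of all Australian species, known and unknown:
estimated_total = described / described_fraction
unknown_fraction = 1 - described_fraction

print(round(estimated_total))  # ~567,612 species in total
print(unknown_fraction)        # 0.74 -- roughly the "75 per cent unknown"
```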

Ultraviolet light-induced mutation drives many skin cancers and over 330 genes seem to be involved in the process

Genes that cause cancer when mutated are known as oncogenes. Although KNSTRN hasn't been previously implicated as a cause of human cancers, the research suggests it may be one of the most commonly mutated oncogenes in the world.

"This previously unknown oncogene is activated by sunlight and drives the development of cutaneous squamous cell carcinomas," said Paul Khavari, MD, PhD, the Carl J. Herzog Professor in Dermatology in the School of Medicine and chair of the Department of Dermatology. "Our research shows that skin cancers arise differently from other cancers, and that a single mutation can cause genomic catastrophe."

Cutaneous squamous cell carcinoma is the second most common cancer in humans. More than 1 million new cases are diagnosed globally each year. The researchers found that a particular region of KNSTRN is mutated in about 20 percent of cutaneous squamous cell carcinomas and in about 5 percent of melanomas.

A paper describing the research will be published online Sept. 7, 2014 in Nature Genetics. Khavari, who is also a member of the Stanford Cancer Institute and chief of the dermatology service at the Veterans Affairs Palo Alto Health Care System, is the senior author of the paper. Postdoctoral scholar Carolyn Lee, MD, PhD, is the lead author.

Lee and Khavari made the discovery while investigating the genetic causes of cutaneous squamous cell carcinoma. They compared the DNA sequences of genes from the tumor cells with those of normal skin and looked for mutations that occurred only in the tumors. They found 336 candidate genes for further study, including some familiar culprits. The top two most commonly mutated genes were CDKN2A and TP53, which were already known to be associated with squamous cell carcinoma.

The third most commonly mutated gene, KNSTRN, was a surprise. It encodes a protein that helps to form the kinetochore -- a structure that serves as a kind of handle used to pull pairs of newly replicated chromosomes to either end of the cell during cell division. Sequestering the DNA at either end of the cell allows the cell to split along the middle to form two daughter cells, each with the proper complement of chromosomes.

If the chromosomes don't separate correctly, the daughter cells will have abnormal amounts of DNA. These cells with extra or missing chromosomes are known as aneuploid, and they are often severely dysfunctional. They tend to misread cellular cues and to behave erratically. Aneuploidy is a critical early step toward the development of many types of cancer.

The mutation in the KNSTRN gene was caused by the replacement of a single nucleotide, called a cytosine, with another, called a thymine, within a specific, short stretch of DNA. The swap is indicative of a cell's attempt to repair damage from high-energy ultraviolet rays, such as those found in sunlight.
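The single-nucleotide swap described above (a cytosine read as a thymine) is straightforward to detect computationally once tumor and normal sequences are aligned. A minimal sketch, using made-up 20-base sequences rather than the real KNSTRN locus:

```python
def uv_signature_substitutions(normal, tumor):
    """Return positions where a cytosine in the normal sequence reads as
    thymine in the tumor sequence -- the C>T transition that marks
    UV-induced damage repair. Assumes pre-aligned, equal-length strings."""
    assert len(normal) == len(tumor)
    return [i for i, (n, t) in enumerate(zip(normal, tumor))
            if n == "C" and t == "T"]

# Hypothetical 20-base stretch; not the actual KNSTRN sequence.
normal = "ATGCCGATCCAGTTCGATCA"
tumor  = "ATGCTGATCCAGTTTGATCA"
print(uv_signature_substitutions(normal, tumor))  # [4, 14]
```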

"Mutations at this UV hotspot are not found in any of the other cancers we investigated," said Khavari. "They occur only in skin cancers." The researchers found the UV-induced KNSTRN mutation in about 20 percent of actinic keratoses -- a premalignant skin condition that often progresses to squamous cell carcinoma -- but never in 122 samples of normal skin, indicating the mutation is likely to be an early event in the development of squamous cell carcinomas.

Super secure quantum encryption for everyone available soon

The largest information technology agreement ever signed by Los Alamos National Laboratory brings the potential for truly secure data encryption to the marketplace after nearly 20 years of development at the nation's premier national-security science laboratory.

With a new device set to make unbreakable, quantum-based cryptographic security available for everyone for the very first time, ordinary people will be able to use cryptographic systems that – until recently – only existed as experiments in the most advanced physics laboratories.

Using technology developed at the Los Alamos National Laboratory (LANL) and incorporating the quantum mechanics of random photon polarization, the new device generates random numbers and creates cryptographic keys so fast and so securely that the technology is said to revolutionize high-speed cryptography and offer a completely new commercial platform for real-time encryption at high data rates.

This claimed breakthrough is made possible by taking advantage of the various spin states of photons. In line with quantum wave theory, a photon exists in all spin states at once. However, if a photon is passed through a polarizing filter that rejects given spin states, the photon can be made to exhibit just one of four possible states of spin – vertical, horizontal, left, or right.

In this way, random filters may be applied to photons, which, in turn, represent ones or zeroes of binary data, depending on the spin state selected and the binary notation attributed to it.

However, in accordance with Heisenberg's uncertainty principle, once the photon is polarized we cannot accurately measure it again unless we apply a filter at the end of its journey just like the one it passed through at the start. This means that only the receiver – who knows the filter sequence required to decode the incoming photon stream – can read off the encoded data.

More importantly, anyone attempting to intercept the resulting data stream cannot eavesdrop on the transmission because any attempted observation of a quantum system also alters it, and the quantum state changes resulting from attempted unauthorized reading would be immediately detected.
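The filter-sequence scheme described above is essentially the BB84 quantum key distribution protocol. A toy classical simulation of its "sifting" step is sketched below; the quantum randomness is mocked with a pseudo-random generator, and all names and parameters are illustrative:

```python
import random

def bb84_sift(n_photons=1000, seed=42):
    """Toy sketch of BB84-style key sifting: the sender encodes each bit
    in a randomly chosen filter basis ('+' rectilinear = vertical/
    horizontal, 'x' diagonal = left/right); the receiver measures each
    photon in its own random basis. Only bits where the two bases happen
    to match are kept as key material."""
    rng = random.Random(seed)
    sender_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    sender_bases = [rng.choice("+x") for _ in range(n_photons)]
    recv_bases   = [rng.choice("+x") for _ in range(n_photons)]
    # A mismatched filter gives a random, useless result, so those
    # positions are publicly discarded ("sifting").
    return [bit for bit, s, r in zip(sender_bits, sender_bases, recv_bases)
            if s == r]

key = bb84_sift()
print(len(key))  # roughly half the photons survive the basis comparison
```

In the real protocol an eavesdropper measuring in the wrong basis disturbs the photons, which the two parties detect by comparing a sample of the sifted key.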

LANL has partnered with Whitewood Encryption Systems to market the device which, when released, may well render conventional random number generation systems obsolete. Current systems, based on mathematical formulas that can be broken by a computer with sufficient speed and power, will not be able to compete with a system built on truly random numbers that cannot be second-guessed.

"Quantum systems represent the best hope for truly secure data encryption because they store or transmit information in ways that are unbreakable by conventional cryptographic methods," said Duncan McBranch, Chief Technology Officer at LANL. "This licensing agreement with Whitewood Encryption Systems, Inc. is historic in that it takes our groundbreaking technical work that was developed over two decades into commercial encryption applications."

denis binette's curator insight, September 7, 2014 10:13 PM

Soon.  How long before? 

Miro Svetlik's curator insight, September 10, 2014 4:36 AM

Finally, quantum encryption is making its way to the masses in the form of a relatively small device. I hope this will solve secure communication on the internet, at least for a while.

Intel's Core M processor is the first product manufactured using 14nm technology

Intel on Monday provided details about the microarchitecture of the Intel Core M processor, the first product to be manufactured using 14nm technology. AnandTech said that "Core M will be launch vehicle for Broadwell and will be released for the holiday period this year." Providing some context, CNET observed that Intel and other chip companies have been racing to advance processor technologies "by shrinking the geometries of the chips," and that Intel appears to be ahead of the pack, with processors built at 14 nanometers, or billionths of a meter. AnandTech commented: "Intel appears to be back on track. 14nm is in volume production in time for Broadwell-Y to reach retail before the end of the year."

What does the Core M mean for manufacturers and consumers? CNET said, for one, the chip will allow PC makers to build much thinner and lighter devices. In all, Intel's move from a 22-nanometer to a 14-nanometer chip can translate into devices that are "thinner, lighter, more power-efficient, and don't need a fan," said CNET. The Wall Street Journal said, "The first chip based on the new production process—which is called the Intel Core M and based on a design called Broadwell—will be targeted at tablets and other devices that operate without a cooling fan but are as thin as nine millimeters or less."

Intel's own statement said, "The combination of the new microarchitecture and manufacturing process will usher in a wave of innovation in new form factors, experiences and systems that are thinner and run silent and cool."

As for process, "Intel's 14 nanometer technology uses second-generation Tri-gate transistors to deliver industry-leading performance, power, density and cost per transistor," said Mark Bohr, Intel senior fellow, Technology and Manufacturing Group, and director, process architecture and integration. "Intel's investments and commitment to Moore's law is at the heart of what our teams have been able to accomplish with this new process."

CNET noted the first systems using Core M will reach shelves for the holiday period, and the bulk of new devices will be available in the first half of 2015. Gizmodo remarked, "We'll most likely see Core M branding on the boxes of select tablet devices this holiday season with even more laptop and PCs hopping on board in early 2015."

New deep sea mushroom-shaped organisms discovered with similarities to 600 million year old extinct creatures

Scientists discovered two new species of sea-dwelling, mushroom-shaped organisms, according to a study published September 3, 2014 in the open-access journal PLOS ONE by Jean Just from University of Copenhagen, Denmark, and colleagues.

Scientists classify organisms based on shared characteristics using a hierarchy of taxonomic ranks, including kingdom, phylum, and species. In 1986, the authors of this study collected organisms at depths of 400 and 1,000 meters on the south-east Australian continental slope, and only recently isolated two types of mushroom-shaped organisms that they could not classify into an existing phylum.

The new organisms are multicellular and mostly non-symmetrical, with a dense layer of gelatinous material between the outer skin-cell and inner stomach-cell layers. They were classified as two new species, Dendrogramma enigmatica and Dendrogramma discoides, in a new genus, Dendrogramma, within the new family Dendrogrammatidae. The scientists found similarities between the organisms and members of the phyla Ctenophora and Cnidaria, and suggest that they may be related to one of these groups. They also noted similarities to 600-million-year-old Precambrian extinct life forms, suggested by some to be early but failed attempts at multicellular life.

The authors originally preserved the specimens in neutral formaldehyde and stored them in 80% ethanol, which makes them unsuitable for molecular analysis. However, they suggest attempting to secure new samples for further study, which may provide further insight into their relationship to other organisms.

Jørgen Olesen added: "New mushroom-shaped animals from the deep sea have been discovered which could not be placed in any recognized group of animals. Two species are recognized, and current evidence suggests that they represent an early branch on the tree of life, with similarities to the 600-million-year-old extinct Ediacara fauna."

Scientists have for the first time mapped the atomic structure of a protein within a living cell

The technique, which peered into cells with an X-ray laser, could allow scientists to explore some components of living cells as never before.

The research, published Aug. 18, 2014 in Proceedings of the National Academy of Sciences, was conducted at the Department of Energy's SLAC National Accelerator Laboratory.

"This is a new way to look inside cells," said David S. Eisenberg, a biochemistry professor at University of California, Los Angeles, and Howard Hughes Medical Institute investigator.

"There are a lot of semi-ordered materials in cells where an X-ray laser could provide powerful information," Eisenberg added. They include arrays in white blood cells that help to fight parasites and infections, insulin-containing structures in the pancreas and structures that break fatty acids and other molecules into smaller units to release energy.

In the experiment at SLAC's Linac Coherent Light Source X-ray laser, a DOE Office of Science User Facility, researchers probed a soil-dwelling bacterium, Bacillus thuringiensis or Bt, that is commonly used as a natural insecticide. Strains of this bacterium produce microscopic protein crystals and spores that kill insects. Normally scientists need to find ways to crystallize proteins in order to get their structures – typically a time-consuming, hit-and-miss process – but these naturally occurring crystals eliminated that step.

A liquid solution containing the living cells was jetted into the path of the ultrabright LCLS X-ray laser pulses. When a laser pulse struck a crystal, it created a pattern of diffracted X-ray light. More than 30,000 of these patterns were combined and analyzed by sophisticated software to reproduce the detailed 3-D structure of the protein.

Many of the bacterial cells likely ruptured and spewed their crystal contents as they flew at high speed toward the X-rays. But because it took just thousandths of a second for the cells to reach the X-ray pulses, it's very likely that many of the X-ray images showed protein crystals that were still inside the cells, the researchers concluded.

Atomically thin molybdenum disulfide opens door to high-speed integrated nanophotonic circuits

Scientists at the University of Rochester and the Swiss Federal Institute of Technology in Zurich have devised an experimental circuit consisting of a silver nanowire and a single-layer, atomically thin flake of molybdenum disulfide (MoS2) — a step toward building computer chips capable of transporting digital information at light speed.

The researchers used a laser to excite electromagnetic waves called plasmons (vibrating electron clouds) at the surface of the wire, causing an MoS2 flake at the far end of the wire to generate strong light emission. MoS2 excitons can also decay into nanowire plasmons, they found.

This interaction can be exploited to create nanophotonic integrated circuits, said Nick Vamivakas, assistant professor of quantum optics and quantum physics at the University of Rochester and senior author of the paper in the journal Optica.

Photonic devices can be much faster than electronic ones, but they are bulkier and cannot be miniaturized nearly as well as electronic circuits. The new results hold promise for guiding the transmission of light and maintaining the intensity of the signal in very small dimensions.

In bulk MoS2, electrons and photons interact as they would in traditional semiconductors like silicon and gallium arsenide. But when MoS2 is trimmed down to an atomically thin layer, the transfer of energy between electrons and photons becomes highly efficient.*

Combining electronics and photonics on the same integrated circuits could drastically improve the performance and efficiency of mobile technology. The researchers say the next step is to create a near-field detector based on MoS2 and an MoS2 light-emitting diode coupled to on-chip nanoplasmonic circuitry.

* The key to MoS2's desirable photonic properties is the structure of its energy band gap. As the material's layer count decreases, it transitions from an indirect to a direct band gap, which allows electrons to easily move between energy bands by releasing photons. Graphene is inefficient at light emission because it has no band gap.

The Gut Microbe That Protects Against Peanut Allergy

The presence of a common gut microbe called Clostridia protects mice against peanut sensitization by keeping the allergens from entering their bloodstream, according to findings published in Proceedings of the National Academy of Sciences.

In the U.S., food allergy rates among children rose about 50 percent between 1997 and 2011. We don’t know what causes food allergies, though numerous studies hint that recent changes in diet and hygiene (and the use of antibiotics and antimicrobial this and that) have altered the natural community of microorganisms in our gastrointestinal tracts -- increasing our susceptibility to food allergies.

To see how altered microbiota affect immune responses to food, a team led by Cathryn Nagler from the University of Chicago exposed three groups of mice to peanut allergens: germ-free mice without any resident bacteria, mice given antibiotics as newborns to reduce their GI bacteria, and control mice with a normal cohort of GI bacteria.

Germ-free and antibiotic-treated mice showed strong immunological responses, producing higher levels of antibodies against peanut allergens compared to mice with normal gut bacteria, which seem to enjoy some degree of protection against food allergies.

This peanut sensitization (the rodent model of human allergy) can be reversed. When Clostridia bacteria were reintroduced into the intestines of germ-free and antibiotic-treated mice, the mice were no longer sensitive to peanuts. Introducing another type of common GI bacteria, called Bacteroides, failed to alleviate sensitization, further suggesting that Clostridia bacteria are the ones mediating the protection.

To identify the protective mechanism, the team looked at the immune responses on a cellular and molecular level. A gene expression analysis revealed that Clostridia induce an immune response -- the production of a cytokine called interleukin-22 (IL-22) -- that reduces the permeability of the lining of mouse intestines. This results in less allergen reaching the bloodstream. "The bacteria are maintaining the integrity of the [intestinal] barrier," Nagler tells Science.

Finally, the team gave antibiotic-treated mice either IL-22 or Clostridia. When exposed to peanut allergens, mice in both conditions showed reduced allergen levels in their blood, compared to controls. Accordingly, allergen levels increased when mice were given antibodies that neutralized IL-22.

“The first step in getting sensitized to a food allergen is for it to get into your blood and be presented to your immune system,” Nagler says in a news release. “The presence of these bacteria regulates that process." Her team is working to develop and test compositions that could be used for probiotic therapy.

Pediatric Crohn disease patients exhibit specific ileal transcriptome and microbiome signature

Current evidence suggests that the inflammatory bowel diseases (IBDs) Crohn disease (CD) and ulcerative colitis (UC) are caused by a complex interaction among host genetic background, microbial shifts, and environmental cues, leading to inappropriate chronic activation of the mucosal immune system (1–3). While it is difficult to establish causality in patient-based studies, it is reasonable to suggest that large inception cohorts that include clinical, genetic, mucosal, and microbial profiling might be the optimal way to address the diversity associated with IBD pathogenesis with adequate power.

A recent meta-analysis of IBD genetic studies identified 163 IBD risk loci (4), and many of these risk alleles exhibit infection-related balancing natural selection. Consistent with this finding, an overall dysfunction in the human gut microbial community has been described in both long-standing adult-onset IBD (5) and treatment-naive pediatric-onset IBD (6), and patients exhibit altered responses to bacterial DNA (7). Animal models have conclusively shown causality in the requirement for bacterial colonization in the development of intestinal inflammation in genetically susceptible hosts (3). However, characterization of host/microbial profiles in the affected and unaffected mucosa at the onset of disease in large patient-based cohorts has been lacking.

Interactions between the host and gut microbial community likely contribute to Crohn disease (CD) pathogenesis; however, direct evidence for these interactions at the onset of disease has been lacking. Here, the researchers characterized the global pattern of ileal gene expression and the ileal microbial community in 359 treatment-naive pediatric patients with CD, patients with ulcerative colitis (UC), and control individuals. The team identified core gene expression profiles and microbial communities in the affected CD ilea that are preserved in the unaffected ilea of patients with colon-only CD but not present in patients with UC or control individuals; this signature is therefore specific to CD and independent of clinical inflammation. An abnormal increase in expression of the antimicrobial dual oxidase gene DUOX2 was detected in association with an expansion of Proteobacteria in both UC and CD, while expression of the lipoprotein gene APOA1 was down-regulated and associated with CD-specific alterations in Firmicutes. The increased-DUOX2, decreased-APOA1 expression signature favored oxidative stress and Th1 polarization and was maximally altered in patients with more severe mucosal injury.

A regression model that included APOA1 gene expression and microbial abundance more accurately predicted month 6 steroid-free remission than a model using clinical factors alone. These CD-specific host and microbe profiles identify the ileum as the primary inductive site for all forms of CD and may direct prognostic and therapeutic approaches.
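The paper's actual regression model is not reproduced here. As a hedged illustration of the idea (predicting month-6 steroid-free remission from an expression feature plus a microbial-abundance feature), a minimal logistic regression trained by gradient descent on synthetic data:

```python
import math

def train_logreg(X, y, lr=0.1, epochs=2000):
    """Minimal pure-Python logistic regression via stochastic gradient
    descent. Feature names and all data below are synthetic assumptions,
    not the study's cohort."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted remission probability
            g = p - yi                        # gradient of log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic patients: [APOA1 expression (z-score), microbial-abundance score]
X = [[1.2, 0.8], [0.9, 1.1], [1.5, 0.6], [-1.0, -0.7], [-1.3, -1.1], [-0.8, -0.9]]
y = [1, 1, 1, 0, 0, 0]  # 1 = month-6 steroid-free remission
w, b = train_logreg(X, y)
print(predict(w, b, [1.0, 1.0]) > 0.5)  # high APOA1 -> remission predicted
```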

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Coffee Genome Sequenced and Annotated

Coffee Genome Sequenced and Annotated | Amazing Science |

The coffee genome has been sequenced for the first time, brewing up a better understanding of that flavor, aroma, and buzz we love (and need) so much. According to the findings, published in Science this week, the caffeine-producing enzymes in coffee evolved independently from those in tea and chocolate. 

“The coffee genome helps us understand what’s exciting about coffee -- other than that it wakes me up in the morning,” says SUNY Buffalo’s Victor Albert in a news release. “By looking at which families of genes expanded in the plant, and the relationship between the genome structure of coffee and other species, we were able to learn about coffee’s independent pathway in evolution, including -- excitingly -- the story of caffeine.”

Commonly known as robusta coffee, Coffea canephora makes up 30 percent of the coffee produced worldwide -- which totals 8.7 million tons a year or 2.25 billion cups a day. The less acidic-tasting Coffea arabica makes up most of the rest, but this lower-caffeine variety has a more complicated genome. 

So, to derive a draft genome of Coffea canephora, a huge consortium led by Albert and researchers from the French Institute of Research for Development and the French National Sequencing Center pieced together DNA sequences and assembled a total length of 568.6 megabases -- that’s 80 percent of the plant’s 710-megabase genome.
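The 80 percent figure follows directly from the two lengths quoted above:

```python
# Assembly coverage computed from the figures in the text.
assembled_mb = 568.6   # assembled length, megabases
genome_mb = 710.0      # estimated genome size, megabases

coverage_pct = 100.0 * assembled_mb / genome_mb  # ~80.1 percent
```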

After running comparative genomics software on protein sequences from coffee, grape, tomato, and the flowering plant Arabidopsis, the team identified 16,000 genes that originated from a single gene in their last common ancestor. They were also able to pinpoint adaptations in genes for disease resistance and caffeine production that are unique to coffee. Overall, the team identified 25,574 protein-coding genes in the Coffea canephora genome, including 23 new genes found only in coffee.

The robusta coffee genome also revealed that the enzymes involved in coffee’s caffeine production -- called N-methyltransferases -- adapted independently from those in cacao and tea. That is, they didn’t inherit their caffeine-linked genes from a common ancestor: The ability to produce caffeine must have evolved at least twice, and long before we started depending on it.

But what good is caffeine for plants? It may protect the coffee plant from predators like leaf-eating bugs, and when its leaves fall to the ground, the high caffeine concentration stunts the growth of rival plants trying to develop near them. “Caffeine also habituates pollinators and makes them want to come back for more, which is what it does to us, too,” Albert tells Nature. Furthermore, over evolutionary time, the coffee genome wasn't triplicated or duplicated en masse. Instead, the team thinks that the duplication of individual genes, including the caffeine ones, spurred innovations, Science explains.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Oxytricha trifallax breaks its own DNA into a quarter-million pieces and rapidly reassembles them for mating

Oxytricha trifallax breaks its own DNA into a quarter-million pieces and rapidly reassembles them for mating | Amazing Science |

Life can be so intricate and novel that even a single cell can pack a few surprises, according to a study led by Princeton University researchers. The pond-dwelling, single-celled organism Oxytricha trifallax has the remarkable ability to break its own DNA into nearly a quarter-million pieces and rapidly reassemble those pieces when it's time to mate, the researchers report in the journal Cell.

The organism internally stores its genome as thousands of scrambled, encrypted gene pieces. Upon mating with another of its kind, the organism rummages through these jumbled genes and DNA segments to piece together more than 225,000 tiny strands of DNA. This all happens in about 60 hours.
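As a toy illustration of the descrambling task, imagine each stored segment tagged with its destination in the functional gene (in Oxytricha, short repeated "pointer" sequences at segment ends play this addressing role). The indices and sequences below are invented:

```python
# Scrambled precursor segments: (position in the functional gene, sequence).
scrambled = [(3, "TTG"), (1, "ATG"), (4, "TAA"), (2, "GCC")]

def descramble(segments):
    """Reassemble segments in destination order, as the cell does at scale."""
    return "".join(seq for _, seq in sorted(segments))

gene = descramble(scrambled)   # "ATGGCCTTGTAA"
```

The real organism performs this over more than 225,000 pieces in roughly 60 hours, with the added complications of inverted and overlapping segments.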

The organism's ability to take apart and quickly reassemble its own genes is unusually elaborate for any form of life, explained senior author Laura Landweber, a Princeton professor of ecology and evolutionary biology. That such intricacy exists in a seemingly simple organism accentuates the "true diversity of life on our planet," she said.

"It's one of nature's early attempts to become more complex despite staying small in the sense of being unicellular," Landweber said. "There are other examples of genomic jigsaw puzzles, but this one is a leader in terms of complexity. People might think that pond-dwelling organisms would be simple, but this shows how complex life can be, that it can reassemble all the building blocks of chromosomes."

From a practical standpoint, Oxytricha is a model organism that could provide a template for understanding how chromosomes in more complex animals such as humans break apart and reassemble, as can happen during the onset of cancer, Landweber said. While chromosome dynamics in cancer cells can be unpredictable and chaotic, Oxytricha presents an orderly step-by-step model of chromosome reconstruction, she said.

"It's basically bad when human chromosomes break apart and reassemble in a different order," Landweber said. "The process in Oxytricha recruits some of the same biological mechanisms that normally protect chromosomes from falling apart and uses them to do something creative and constructive instead."

Gertraud Burger, a professor of biochemistry at the University of Montreal, said that the "rampant and diligently orchestrated genome rearrangements that take place in this organism" demonstrate a unique layer of complexity for scientists to consider when it comes to studying an organism's genetics.

"This work illustrates in an impressive way that the genetic information of an organism can undergo substantial change before it is actually used for building the components of a living cell," said Burger, who is familiar with the work but had no role in it.

"Therefore, inferring an organism's make-up from the genome sequence alone can be a daunting task and maybe even impossible in certain instances," Burger said. "A few cases of minor rearrangements have been described in earlier work, but these are dilettantes compared to [this] system."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Create a personalized high-performance computer by crossing the barriers between clouds

Create a personalized high-performance computer by crossing the barriers between clouds | Amazing Science |

The Information Technology Research Institute of Japan's National Institute of Advanced Industrial Science and Technology (AIST) has developed a technology that, once an environment for high-performance computing has been established, allows a virtual cluster-type computer to be built easily on a different cloud and made available for immediate use.

Generally, high-performance computing uses cluster-type computers, in which many computers are bundled together and run as a single machine, but their hardware configurations are not uniform. Clouds, on the other hand, provide virtual computers that are independent of hardware configuration, and by bundling these together a virtual cluster-type computer can be created. Until now, however, the user had to re-install software or redo the settings for each different cloud. A technology has therefore been developed for building virtual cluster-type computers on the design concept of "Build Once, Run Everywhere": once the environment to run an application has been established, it can run on any cloud, be it private, commercial, or otherwise. Furthermore, since there is no constraint on the number of virtual computers that can be incorporated into a cluster, when computing power is insufficient an even larger virtual cluster-type computer can be formed on another cloud that offers more virtual computers, while being used in exactly the same manner.
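The "Build Once, Run Everywhere" idea can be sketched as a single cloud-agnostic cluster specification that is rendered into per-cloud launch requests. The field names and cloud identifiers below are hypothetical, not AIST's actual interface:

```python
# One cluster specification, reusable across clouds.
CLUSTER_SPEC = {"image": "hpc-env-v1", "nodes": 16, "vcpus_per_node": 8}

# Per-cloud details that the tooling, not the user, fills in.
CLOUD_DEFAULTS = {"asgc": {"zone": "private"}, "ec2": {"zone": "us-east-1"}}

def render_launch(spec, cloud):
    """Translate the cloud-agnostic spec into a launch request for one cloud."""
    request = dict(spec)
    request.update(CLOUD_DEFAULTS[cloud])
    return request

asgc_req = render_launch(CLUSTER_SPEC, "asgc")
ec2_req = render_launch(CLUSTER_SPEC, "ec2")
```

The cluster shape stays identical across clouds; only the cloud-specific details change, which is what spares the user from re-installing and reconfiguring.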

A virtual cluster-type computer was formed on AIST's private cloud, the AIST Super Green Cloud (ASGC), and the ability to re-create and use it on Amazon EC2, a commercial cloud, was verified. With this technology, users and application fields that previously lacked access to high-performance computing can now make use of it, and the developed technology is thus expected to contribute to the enhancement of industrial competitiveness.

Many research organizations and companies require high-performance computing, for example in automobile development and drug discovery. Conventionally, each organization maintained its own cluster-type computers. Solving problems that exceeded a system's computing capacity required introducing an even higher-performance system, which was rarely available at short notice.

In the clouds widely available today, computing performance can be increased by adding computers, bundling virtual computers to form a cluster-type computer. However, re-creating a built environment on a different cloud required re-installing the software and reconfiguring the settings, costing extra time, labor, and money.

Furthermore, because the initial introduction and operating costs of cluster-type computers are high, small and medium-sized enterprises in particular could not maintain an environment for high-performance computing. Expanding the fields in which high-performance computing can be applied, in support of such users, is required to enhance industrial competitiveness.

AIST is conducting R&D aimed at a high-performance computing infrastructure that combines high computing performance with the convenience of running on any cluster-type computer once an application-executing environment has been created. R&D was therefore conducted under the concept of separating the application-executing environment from the actual machines through virtualization, using cloud technologies to establish cluster-type computers on various clouds as required. Although clouds are built on virtualization technology, in the field of high-performance computing the drop in computing performance caused by virtualization has hindered its adoption. AIST therefore evaluated in detail the effects of virtualization on the execution of high-performance computing applications and developed technologies to reduce the performance degradation that virtualization causes.

SaaS Guru's curator insight, September 11, 2014 8:53 AM

A bit technical, but full of promise!

Scooped by Dr. Stefan Gruenwald!

Vaccinated monkeys show "long-term" immunity to the Ebola virus, raising the prospect of successful human trials

Vaccinated monkeys show "long-term" immunity to the Ebola virus, raising the prospect of successful human trials | Amazing Science |

The experiments by the US National Institutes of Health showed immunity could last at least 10 months. Human trials of the vaccine started this week in the US and will extend to the UK and Africa.

The World Health Organization says more than 2,000 people have now died in the outbreak in West Africa. Several experimental treatments are now being considered to help contain the spread of Ebola.

This includes a vaccine being developed by the US National Institute of Allergy and Infectious Diseases and pharmaceutical company GlaxoSmithKline. It uses a genetically modified chimp virus containing components of two species of Ebola - Zaire, which is currently circulating in West Africa, and the common Sudan species.

The viral vaccine does not replicate inside the body, but it is hoped the immune system will react to the Ebola component of the vaccine and develop immunity.

The first patient, a 39-year-old woman, was given the vaccine last week as human trials got under way. There will also be separate trials of the vaccine against just the Zaire Ebola species.

These will take place in the US and at the University of Oxford in the UK, as well as in Mali and The Gambia. People will be given just the initial jab, not a follow-up booster, in the trials.

The WHO said safety data would be ready by November 2014 and, if the vaccine proved safe, it would be used in West Africa immediately.

Healthcare workers and other frontline staff would be prioritised for vaccination.

The number of doses currently available is between 400 - if a lot of vaccine is needed for immunity - and 4,000 if smaller amounts are sufficient. As with all experimental therapies, the WHO has warned hopes of a vaccine must not detract from the proven methods of infection control which have defeated all previous outbreaks.

Prof Jonathan Ball, a virologist at the University of Nottingham, said: "This is really encouraging data."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Scientist uncovers Mars' climate history in unique meteorite known as Black Beauty

Scientist uncovers Mars' climate history in unique meteorite known as Black Beauty | Amazing Science |

Research underway at the National High Magnetic Field Laboratory may one day answer those questions — and perhaps even help pave the way for future colonization of the Red Planet. By analyzing the chemical clues locked inside an ancient Martian meteorite known as Black Beauty, Florida State University Professor Munir Humayun and an international research team are revealing the story of Mars’ ancient, and sometimes startling, climate history.

The team’s most recent finding of a dramatic climate change appeared in Nature Geoscience, in the paper “Record of the ancient Martian hydrosphere and atmosphere preserved in zircon from a Martian meteorite.”

The scientists found evidence for the climate shift in minerals called zircons embedded inside the dark, glossy meteorite. Zircons, which are also abundant in the Earth’s crust, form when lava cools. Among their intriguing properties, Humayun says, is that “they stick around forever.”

“When you find a zircon, it’s like finding a watch,” Humayun said. “A zircon begins keeping track of time from the moment it’s born.”

Last year, Humayun’s team correctly determined that the zircons in its Black Beauty sample were an astonishing 4.4 billion years old. That means, Humayun says, it formed during the Red Planet’s infancy and during a time when the planet might have been able to sustain life.

“First we learned that, about 4.5 billion years ago, water was more abundant on Mars, and now we’ve learned that something dramatically changed that,” said Humayun, a professor of geochemistry. “Now we can conclude that the conditions that we see today on Mars, this dry Martian desert, must have persisted for at least the past 1.7 billion years. We know now that Mars has been dry for a very long time.”

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Lactic acid bacteria from the honey bees could be the source for efficient treatment of MRSA

Lactic acid bacteria from the honey bees could be the source for efficient treatment of MRSA | Amazing Science |

Could honeybees' most valuable contribution to mankind, besides pollination, be alternative tools against infection? With the emergence of antibiotic-resistant pathogens, we are entering a new era of searching for alternative tools against infections. Natural products such as honey have been applied against human infections for millennia, without sufficient scientific evidence. Researchers have now discovered a unique lactic acid bacterial (LAB) microbiota that lives in symbiosis with honeybees and is present in large amounts in fresh honey across the world. This work investigates whether the LAB symbionts are the source of the unknown factors contributing to honey's properties.

Hence, a group of researchers at Lund University have tested the LAB against severe wound pathogens such as methicillin-resistant Staphylococcus aureus (MRSA), Pseudomonas aeruginosa and vancomycin-resistant Enterococcus (VRE) among others. They were able to demonstrate a strong antimicrobial activity from each symbiont and a synergistic effect, which counteracted all the tested pathogens. The mechanisms of action are partly shown by elucidating the production of active compounds such as proteins, fatty acids, anaesthetics, organic acids, volatiles and hydrogen peroxide. The team showed that the symbionts produce a myriad of active compounds that remain in variable amounts in mature honey. Further studies are now required to investigate if these symbionts have a potential in clinical applications as alternative tools against topical human and animal infections.

"Antibiotics are mostly one active substance, effective against only a narrow spectrum of bacteria. When used alive, these 13 lactic acid bacteria produce the right kind of antimicrobial compounds as needed, depending on the threat. It seems to have worked well for millions of years of protecting bees' health and honey against other harmful microorganisms. However, since store-bought honey doesn't contain the living lactic acid bacteria, many of its unique properties have been lost in recent times", explains Tobias Olofsson.

The next step is further studies to investigate wider clinical use against topical human infections as well as on animals. The findings have implications for developing countries, where fresh honey is easily available, but also for Western countries where antibiotic resistance is seriously increasing.

Reference: Lactic acid bacterial symbionts in honeybees – an unknown key to honey's antimicrobial and therapeutic activities

No comment yet.
Scooped by Dr. Stefan Gruenwald!

FDA-Approved 3D Printed Face Implant is a First

FDA-Approved 3D Printed Face Implant is a First | Amazing Science |

3D printed organs, skulls and vertebrae are just a few of the ways 3D printing can literally be a part of us. On Tuesday, biomedical devices company Oxford Performance Materials (OPM) announced the latest addition to 3D printed body parts: a 3D printed face.

OPM has received official FDA approval for a 3D printed facial device that can be used on patients in need of facial reconstructive surgery. The 3D printed OsteoFab® Patient-Specific Facial Device (OPSFD), which is the first and only FDA cleared 3D printed polymeric facial implant, is entirely customizable. It is made of different 3D printed parts that are made to fit each individual patient’s anatomical features.

What is equally revolutionary about the 3D printed facial implant is the drastic reduction in price it brings to facial reconstructive surgery. As it is tailor-made to each patient, the OPSFD reduces overall cost of ownership of a facial implant by reducing operating time, hospital stay duration and the chance of procedure complications. It also minimizes time before surgery as the implant can be 3D printed quickly.

Scott DeFelice, the CEO of OPM, referred to the FDA’s approval of the OPSFD as a paradigm shift:

“There has been a substantial unmet need in personalized medicine for truly individualized – yet economical – solutions for facial reconstruction, and the FDA’s clearance of OPM’s latest orthopedic implant marks a new era in the standard of care for facial reconstruction. Until now, a technology did not exist that could treat the highly complex anatomy of these demanding cases.

With the clearance of our 3D printed facial device, we now have the ability to treat these extremely complex cases in a highly effective and economical way, printing patient-specific maxillofacial implants from individualized MRI or CT digital image files from the surgeon. This is a classic example of a paradigm shift in which technology advances to meet both the patient’s needs and the cost realities of the overall healthcare system.”

Oxford Performance Materials also developed the first and only 3D printed customizable skull implant, which was approved by the FDA in February 2013 and later used to replace 75% of a patient’s skull. According to the president of OPM’s biomedical division, the two implants can now be used together for more complex cases.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Plant genomics: Methods for obtaining large, phylogenomic data sets

Plant genomics: Methods for obtaining large, phylogenomic data sets | Amazing Science |

The use of next-generation sequencing (NGS) technologies in phylogenetic studies is in a state of continual development and improvement. Though the botanically-inclined have historically focused on markers from the chloroplast genome, the importance of incorporating nuclear data is becoming increasingly evident. Nuclear genes provide not only the potential to resolve relationships between closely related taxa, but also the means to disentangle hybridization and better understand incongruences caused by incomplete lineage sorting and introgression.

By harnessing the power of NGS—which has increased sequencing capacity by several orders of magnitude over the past few years—scientists are now able to easily sequence enormous amounts of DNA or RNA from any genome within an organism, a practice that is transforming many areas of plant biology.

A team of international scientists, led by researchers at Oregon State University, has utilized a recently developed method to assemble a phylogenomic data set containing hundreds of nuclear loci and plastomes for milkweeds.

"This approach, termed Hyb-Seq, uses targeted sequence capture via hybridization-based enrichment and has shown great promise for obtaining large nuclear data sets," explains Dr. Aaron Liston, principal investigator of the study. "Sequencing low-copy nuclear genes has traditionally required a large amount of effort for each gene. Hyb-Seq eliminates the need for PCR optimization and cloning—two time-consuming and sometimes problematic steps."

The protocol is freely available in the September issue of Applications in Plant Sciences. While it would be ideal to simply sequence entire genomes for every organism being studied, this is not yet feasible across large numbers of species. The Hyb-Seq approach reduces genomic complexity of the organism-of-interest by targeting only a small portion of the total genome. This is achieved by hybridizing DNA or RNA probes to specific regions of the genome, then simply discarding the remaining, unwanted regions.

"The probe design was done bioinformatically by comparing our draft sequence of the milkweed genome and transcriptome (expressed genes) to another genome in the same family and to genes that are conserved across the asterids and the angiosperms," explains Liston. "This allowed us to eliminate duplicated genes that can complicate phylogenetic inference and select relatively conserved genes, so that they could be obtained from divergent milkweed species with a single probe set."

This approach enabled Liston and colleagues to sequence over 700 genes for 10 Asclepias species and two related genera. "Furthermore," says Liston, "we were able to assemble complete plastomes from the off-target reads."

"It is likely that as sequencing technology advances, it will be feasible in the next decade or so to sequence complete genomes routinely and inexpensively. However, until that time, the ability to sequence hundreds of genes at a time—as is possible with the Hyb-Seq method—represents a significant and exciting advance over previous methods."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Membrane of banked red blood cells grows stiffer with age, study finds

Membrane of banked red blood cells grows stiffer with age, study finds | Amazing Science |

It may look like fresh blood and flow like fresh blood, but the longer blood is stored, the less it can carry oxygen into the tiny microcapillaries of the body, says a new study from University of Illinois researchers.

Using advanced SLIM topography, the researchers measured the stiffness of the membrane surrounding red blood cells over time. They found that, even though the cells retain their shape and hemoglobin content, the membranes get stiffer, which steadily decreases the cells’ functionality. 

Led by electrical and computer engineering professor Gabriel Popescu, the team published its results in the journal Scientific Reports.

“Our results show some surprising facts: Even though the blood looks good on the surface, its functionality is degrading steadily with time,” said Popescu, who is also part of the Beckman Institute for Advanced Science and Technology at the U. of I. 

Nearly 14 million units of blood are banked annually in the U.S. The established “shelf life” for blood in blood banks is 42 days. During that time, a lot of changes can happen to the blood cells – they can become damaged or rupture. But much of the blood keeps its shape and, by all appearances, looks like it did the day it was donated.  

Popescu and his colleagues wanted to quantitatively measure blood cells over time to see what changed and what stayed the same, to help determine what effect older blood could have on a patient. They used a special optical technique called spatial light interference microscopy (SLIM), a method developed in Popescu’s lab at Illinois in 2011. It uses light to noninvasively measure cell mass and topology with nanoscale accuracy. Through software and hardware advances, the SLIM system today acquires images almost 100 times faster than just three years ago.

The researchers took time-lapse images of the cells, measuring and charting the cell’s properties. In particular, they were able to measure nanometer scale motions of the cell membrane, which are indicative of the cell’s stiffness and function. The fainter the membrane motion, the less functional the cell, much like how a fainter pulse indicates problems with a patient.
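The fluctuation measurement can be sketched as follows: given a stack of height maps over time, the temporal standard deviation at each pixel, averaged over the cell, yields a single fluctuation amplitude, which is larger for a compliant (fresh) membrane than for a stiff (stored) one. The data below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_fluctuation_nm(height_maps):
    """Mean temporal fluctuation of membrane height, in nanometers.
    height_maps: array of shape (frames, ny, nx)."""
    return float(np.mean(np.std(height_maps, axis=0)))

# Synthetic time-lapse stacks: a fresh cell fluctuating with ~30 nm
# amplitude versus a stored, stiffer cell at ~10 nm.
fresh = 30.0 * rng.standard_normal((100, 32, 32))
stored = 10.0 * rng.standard_normal((100, 32, 32))

fresh_amp = mean_fluctuation_nm(fresh)    # ~30 nm
stored_amp = mean_fluctuation_nm(stored)  # ~10 nm
```

A fainter fluctuation amplitude maps onto the article's "fainter pulse" analogy: the smaller the number, the stiffer and less functional the cell.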

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Tests confirm nickel-78 is a 'doubly magic' isotope

Tests confirm nickel-78 is a 'doubly magic' isotope | Amazing Science |

The stability of atoms can vary considerably from one element to the next, and also between isotopes of the same element (whose nuclei contain the same number of protons but different numbers of neutrons). While many isotopes are unstable and rapidly undergo radioactive decay, certain 'magic' isotopes show exceptional stability. Clarifying the properties of these stable isotopes is essential for understanding how the chemical elements formed in the early Universe. In an important step toward verifying various theoretical models, Shunji Nishimura and colleagues from the RIKEN Nishina Center for Accelerator-Based Science have now verified the magic numbers of an enigmatic 'doubly magic' isotope, nickel-78.

The magic numbers for isotope stability are well established for isotopes with similar numbers of protons and neutrons. The seven most widely recognized magic numbers are 2, 8, 20, 28, 50, 82 and 126; these correspond to the number of particles needed to completely fill proton or neutron 'shells' in the nucleus. The nickel-78 (78Ni) isotope contains 28 protons and 50 neutrons, making it doubly magic according to this series. However, isotopes with such a large excess of neutrons compared to protons are predicted to have different magic numbers, and some theoretical models even suggest that 78Ni is not magic at all. Consequently, much attention has been paid to the magic properties of 78Ni in efforts to verify theoretical models of nuclear physics and the formation of heavy elements.
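The arithmetic behind "doubly magic" is simple enough to check directly against the series of magic numbers given above:

```python
# The seven widely recognized magic numbers for protons or neutrons.
MAGIC_NUMBERS = {2, 8, 20, 28, 50, 82, 126}

def is_doubly_magic(protons, neutrons):
    """An isotope is doubly magic when both counts fill a shell exactly."""
    return protons in MAGIC_NUMBERS and neutrons in MAGIC_NUMBERS

# Nickel-78: 28 protons and 78 - 28 = 50 neutrons, so doubly magic.
ni78 = is_doubly_magic(28, 78 - 28)   # True
# Common nickel-58 (28 protons, 30 neutrons) is magic in protons only.
ni58 = is_doubly_magic(28, 30)        # False
```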

Settling the issue of the magic stability of 78Ni experimentally, however, has proved challenging. "Many experiments have been carried out to identify systematic trends in nuclear properties near 78Ni," says Nishimura. "Yet there has been no clear evidence on whether 78Ni is a doubly magic nucleus, due to the extremely low production yield of this isotope."

Fortunately, RIKEN's Radioactive Isotope Beam Factory is capable of generating high yields of exotic and rare isotopes like 78Ni (Fig. 1). Using this facility, in combination with the newly developed WAS3ABi detector, the research team was able to perform measurements of 78Ni decay with unprecedented precision. The experiments confirmed the doubly magic status of 78Ni, providing valuable insights into the behavior of exotic nuclei with large neutron excess. Such neutron-rich nuclei play an important role in the production of elements heavier than the most stable element iron, such as gold and uranium. "We hope to solve one of the biggest mysteries of this century—where and how were the heavy elements created in the Universe?" explains Nishimura.

Reference: Xu, Z. Y., Nishimura, S., Lorusso, G., Browne, F., Doornenbal, P., Gey, G., Jung, H.-S., Li, Z., Niikura, M., Söderström, P.-A. et al. "β-decay half-lives of 76,77Co, 79,80Ni, and 81Cu: Experimental indication of a doubly magic 78Ni." Physical Review Letters 113, 032505 (2014). DOI: 10.1103/PhysRevLett.113.032505

No comment yet.
Rescooped by Dr. Stefan Gruenwald from Brain Imaging and Neuroscience: The Good, The Bad, & The Ugly!

Rapid whole-brain imaging with single cell resolution

Rapid whole-brain imaging with single cell resolution | Amazing Science |

A major challenge of systems biology is understanding how phenomena at the cellular scale correlate with activity at the organism level. A concerted effort has been made especially in the brain, as scientists are aiming to clarify how neural activity is translated into consciousness and other complex brain activities.

One example of the technologies needed is whole-brain imaging at single-cell resolution. This imaging normally involves preparing a highly transparent sample that minimizes light scattering and then imaging neurons tagged with fluorescent probes at different slices to produce a 3D representation. However, limitations in current methods prevent comprehensive study of the relationship. A new high-throughput method published in Cell, CUBIC (Clear, Unobstructed Brain Imaging Cocktails and Computational Analysis), is a great leap forward: it offers unprecedentedly rapid whole-brain imaging at single-cell resolution, together with a simple protocol, based on the use of amino-alcohols, for clearing the brain sample and making it transparent.

In combination with light sheet fluorescence microscopy, CUBIC was tested for rapid imaging of a number of mammalian systems, such as mouse and primate, showing its scalability for brains of different size. Additionally, it was used to acquire new spatial-temporal details of gene expression patterns in the hypothalamic circadian rhythm center. Moreover, by combining images taken from opposite directions, CUBIC enables whole brain imaging and direct comparison of brains in different environmental conditions.

CUBIC overcomes a number of obstacles of previous methods. First, the clearing and transparency protocol involves serially immersing fixed tissues in just two reagents for a relatively short time. Second, CUBIC is compatible with many fluorescent probes because of its low quenching, which permits probes with longer wavelengths, reduces concern about scattering during whole-brain imaging, and at the same time invites multi-color imaging. Finally, it is highly reproducible and scalable. While other methods have achieved some of these qualities, CUBIC is the first to realize all of them.

CUBIC provides information on previously unattainable 3D gene expression profiles and neural networks at the systems level. Because of its rapid and high-throughput imaging, CUBIC offers extraordinary opportunity to analyze localized effects of genomic editing. It also is expected to identify neural connections at the whole brain level. In fact, last author Hiroki Ueda is optimistic about further application to even larger mammalian systems. “In the near future, we would like to apply CUBIC technology to whole-body imaging at single cell resolution”.

Via Donald J Bolger
Donald J Bolger's curator insight, August 13, 2014 11:15 AM

This sounds too good to be true!


Scooped by Dr. Stefan Gruenwald!

Humans are wiping out species a thousand times faster than nature can create new ones


Sometimes extinction happens naturally. Other times humans are to blame. Given the many millions of plant and animal species that have ever existed, it’s tough to know exactly how to assign responsibility. But new research indicates that we have an alarmingly large role.

Humans are wiping out species at least 1,000 times faster than nature is creating new species, according to a new study in Conservation Biology (paid access only). And it’s getting much worse. In the future, plant and animal species will go extinct at 10,000 times the rate at which new species emerge, the researchers assert.

Looking at both fossils and genetic variation, the study found that nature snuffs out its own creations much more slowly than we’d realized—at a rate of only one yearly extinction for every 10 million species. Past estimates put the “normal background extinction rate”—the rate at which species would go extinct without human interference—at about 10 yearly extinctions for every 10 million species.

Since mankind hit the scene, however, more than 1,000 out of every 10 million species have been dying out each year. “We’ve known for 20 years that current rates of species extinctions are exceptionally high,” said Stuart Pimm, one of the co-authors and president of the nonprofit organization SavingSpecies. “This new study comes up with a better estimate of the normal background rate—how fast species would go extinct were it not for human actions. It’s lower than we thought, meaning that the current extinction crisis is much worse by comparison.”

Overall species diversity grows exponentially richer over time as branches of new species diverge. The authors liken this to a person’s bank account. Think of your income as the number of new species, and your spending as those that go extinct. Every month when you get paid, your net worth jumps for a while before spending whittles it down again. If your spending is constant, that monthly spike will rise over time as your salary increases—just as the number of new species should also rise over time. But the authors saw no such increase, implying that extinction is happening far too fast for the pace of new species creation to keep up.

Take birds, for instance. There are 10,000 species of birds, as Pimm explains in a blog post. At nature’s rate of one yearly extinction per 10 million species, the disappearance of a single bird species should therefore be a once-in-a-millennium event. However, since the year 1500, at least 140 bird species have disappeared—including 13 species we only identified after they went extinct.
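The rate comparisons above reduce to simple arithmetic, and a quick sanity check reproduces the quoted figures (a rough sketch; the ~514-year window from 1500 to the article's 2014 publication is our assumption):

```python
# Extinction rates from the article, in extinctions per year per 10 million species.
background = 1      # new background estimate (1 per 10M species-years)
old_estimate = 10   # earlier background estimate
current = 1_000     # observed rate in the human era

# Humans are driving extinctions at least 1,000x the natural background:
print(current // background)  # 1000

# Birds: 10,000 species at the background rate -> one loss per millennium.
bird_species = 10_000
years_between_losses = 10_000_000 // (bird_species * background)
print(years_between_losses)  # 1000

# At least 140 bird extinctions in the ~514 years since 1500:
observed = 140 / 514 / bird_species * 10_000_000  # per year per 10M species
print(round(observed / background))  # 272 -- hundreds of times the background
```

Even this conservative count (only documented bird extinctions) lands hundreds of times above the background rate, which is the study's central point.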

Paulo Gervasio's curator insight, September 10, 2014 2:39 AM

Why are we blaming ourselves for everything?  There is a biblical story about the tower of Babel.  I am sometimes reminded that maybe we are approaching the arrogance described in that story.  

Scooped by Dr. Stefan Gruenwald!

High-Entropy Alloy: A Metallic Alloy That is Tough and Ductile at Cryogenic Temperatures


A new concept in metallic alloy design – called “high‐entropy alloys” – has yielded a multiple-element material that not only tests out as one of the toughest on record but, unlike most materials, actually improves in toughness, strength, and ductility at cryogenic temperatures. This multi-element alloy was synthesized and tested through a collaboration of researchers at the U.S. Department of Energy (DOE)’s Lawrence Berkeley and Oak Ridge National Laboratories (Berkeley Lab and ORNL).

“We examined CrMnFeCoNi, a high‐entropy alloy that contains five major elements rather than one dominant one,” says Robert Ritchie, a materials scientist with Berkeley Lab’s Materials Sciences Division. “Our tests showed that despite containing multiple elements with different crystal structures, this alloy crystallizes as a single-phase, face‐centered cubic solid with exceptional damage tolerance, tensile strength above one gigapascal, and fracture toughness values that are off the charts, exceeding those of virtually all other metallic alloys.”

“High‐entropy alloys represent a radical departure from tradition,” Ritchie says, “in that they do not derive their properties from a single dominant constituent or from a second phase. The idea behind this concept is that configurational entropy increases with the number of alloying elements, counteracting the propensity for compound formation and stabilizing these alloys into a single phase like a pure metal.” 
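Ritchie's point that configurational entropy rises with the number of alloying elements can be made quantitative with the standard ideal-mixing formula (a textbook sketch, not from the article): for mole fractions x_i, ΔS_conf = −R Σ x_i ln x_i, which for an equiatomic n-element alloy simplifies to R ln n.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def config_entropy(fractions):
    """Ideal configurational entropy of mixing: -R * sum(x_i * ln x_i)."""
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equiatomic n-element alloys: entropy grows as R*ln(n) (a pure metal gives 0).
for n in (2, 3, 5):
    print(n, round(config_entropy([1 / n] * n), 2))  # 2 -> 5.76, 3 -> 9.13, 5 -> 13.38
```

For the five-element equiatomic CrMnFeCoNi this gives R ln 5 ≈ 1.61R, versus zero for a pure metal — the entropy gain credited with counteracting compound formation and stabilizing a single phase.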
