There's still a lot of mystery surrounding Jupiter's moon Europa, but researchers at NASA seem fairly certain that a watery ocean lurks beneath its icy crust.
NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".
This newsletter is aggregated from over 1,450 news sources.
Scientists working in the desert badlands of northwestern Kenya have found stone tools dating back 3.3 million years, long before the advent of modern humans, and by far the oldest such artifacts yet discovered. The tools, whose makers may or may not have been some sort of human ancestor, push the known date of such tools back by 700,000 years; they also may challenge the notion that our own most direct ancestors were the first to bang two rocks together to create a new technology.
Hominins are a group of species that includes modern humans, Homo sapiens, and our closest evolutionary ancestors. Anthropologists long thought that our relatives in the genus Homo - the line leading directly to Homo sapiens - were the first to craft such stone tools. But researchers have been uncovering tantalizing clues that some other, earlier species of hominin, distant cousins, if you will, might have figured it out.
The researchers do not know who made these oldest of tools. But earlier finds suggest a possible answer: The skull of a 3.3-million-year-old hominin, Kenyanthropus platyops, was found in 1999 about a kilometer from the tool site. A K. platyops tooth and a bone from a skull were discovered a few hundred meters away, and an as-yet unidentified tooth has been found about 100 meters away.
The precise family tree of modern humans is contentious, and so far, no one knows exactly how K. platyops relates to other hominin species. Kenyanthropus predates the earliest known Homo species by half a million years. This species could have made the tools; or the toolmaker could have been some other species from the same era, such as Australopithecus afarensis, or an as-yet undiscovered early type of Homo.
GE engineers have made a simple proof-of-concept 3D-printed mini jet engine that operates at 33,000 rotations per minute. The backpack-sized engine was built as a side project over the course of several years to test the technology's capabilities.
The team also designed and developed a fuel nozzle that will be additively manufactured for inclusion in the CFM LEAP jet engine for commercial single-aisle aircraft. The FAA recently approved the first 3D-printed component for a version of the GE90 jet engine.
New research has revealed the opah, or moonfish, as the first fully warm-blooded fish that circulates heated blood throughout its body much like mammals and birds, giving it a competitive advantage in the cold ocean depths.
The silvery fish, roughly the size of a large automobile tire, is known from oceans around the world and dwells hundreds of feet beneath the surface in chilly, dimly lit waters. It swims by rapidly flapping its large, red pectoral fins like wings through the water.
Fish that typically inhabit such cold depths tend to be slow and sluggish, conserving energy by ambushing prey instead of chasing it. But the opah's constant flapping of its fins heats its body, speeding its metabolism, movement and reaction times, scientists report in the journal Science.
That warm-blooded advantage turns the opah into a high-performance predator that swims faster, reacts more quickly and sees more sharply, said fisheries biologist Nicholas Wegner of NOAA Fisheries' Southwest Fisheries Science Center in La Jolla, Calif., lead author of the new paper.
"Before this discovery I was under the impression this was a slow-moving fish, like most other fish in cold environments," Wegner said. "But because it can warm its body, it turns out to be a very active predator that chases down agile prey like squid and can migrate long distances."
Wegner realized the opah was unusual when a coauthor of the study, biologist Owyn Snodgrass, collected a sample of its gill tissue. Wegner recognized an unusual design: Blood vessels that carry warm blood into the fish's gills wind around those carrying cold blood back to the body core after absorbing oxygen from water.
The design is known in engineering as "counter-current heat exchange." In opah it means that warm blood leaving the body core helps heat up cold blood returning from the respiratory surface of the gills where it absorbs oxygen. Resembling a car radiator, it's a natural adaptation that conserves heat. The unique location of the heat exchange within the gills allows nearly the fish's entire body to maintain an elevated temperature, known as endothermy, even in the chilly depths.
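For readers who want numbers: the advantage of counter-current over co-current (parallel-flow) exchange can be illustrated with the standard effectiveness-NTU relations for a balanced exchanger. The sketch below is a generic heat-exchanger illustration under textbook assumptions, not a model of the opah's gills:

```python
import math

def effectiveness_counterflow(ntu):
    # Balanced (equal heat-capacity-rate) counter-current exchanger:
    # effectiveness approaches 1 as NTU (exchanger size) grows.
    return ntu / (1.0 + ntu)

def effectiveness_parallelflow(ntu):
    # Balanced co-current (parallel-flow) exchanger:
    # effectiveness can never exceed 0.5, no matter how large.
    return (1.0 - math.exp(-2.0 * ntu)) / 2.0

for ntu in (1.0, 3.0, 10.0):
    cf = effectiveness_counterflow(ntu)
    pf = effectiveness_parallelflow(ntu)
    print(f"NTU={ntu:4.1f}  counter-current={cf:.2f}  co-current={pf:.2f}")
```

The counter-current geometry keeps a temperature difference between the two streams along their whole length, which is why nature (and car radiators) favor it.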
A rival hacker group to the infamous Lizard Squad has been discovered quietly using a previously unknown global botnet of compromised broadband routers to carry out DDoS and Man-in-the-Middle (MitM) attacks.
The discovery was made by security firm Incapsula (recently acquired by Imperva), which first noticed attacks against a few dozen of its customers in December 2014. Since then, the firm estimates, the botnet has grown to exceed 40,000 IPs across 1,600 ISPs, with at least 60 command and control (C2) nodes.
Almost all of the compromised routers appear to be unidentified ARM-based models from a single US vendor, Ubiquiti, whose products are sold across the world, including in the UK. Incapsula detected traffic from compromised devices in 109 countries, overwhelmingly in Thailand and in Brazil, a known hotspot for router compromise.
The compromise that allowed the Ubiquiti routers to be botted in the first place appears to be connected to one of two vulnerabilities. The first is simply that the devices have been left with their vendor username and password in the default state – perhaps a sign that some of these devices are older – allowing the attackers easy access.
The second and more unexpected flaw is that the routers also allow remote access over HTTP and SSH via their default ports, a configuration issue that amounts to an open door for attackers. Once inside, the attackers appear to have injected a number of pieces of malware, mainly the Linux Spike Trojan, aka ‘MrBlack’, used to mount DDoS attacks. The firm inspected 13,000 malware samples and found evidence of other DDoS tools, including Dorfloo and Mayday.
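A defensive corollary of the exposure described above: whether a device answers on well-known admin ports can be checked with a plain TCP probe. A minimal sketch (the gateway address in the trailing comment is hypothetical, and this only tests reachability, not credentials):

```python
import socket

def exposed_services(host, ports, timeout=2.0):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Example: check whether a (hypothetical) home-gateway address exposes
# the default SSH (22) and HTTP (80) admin ports.
# print(exposed_services("192.168.1.1", [22, 80]))
```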
The C2s for these tools were found to be in several countries, with 73 percent in China and 21 percent in the US. This doesn't mean the attackers were based there; they were simply using infrastructure on hosts in those locations.
“Given how easy it is to hijack these devices, we expect to see them being exploited by additional perpetrators. Even as we conducted our research, the Incapsula security team documented numerous new malware types being added—each compounding the threat posed by the existence of these botnet devices,” said the firm’s researchers.
DNA, the genetic material of all living things, is what makes us who we are. Written in this molecular code are the instructions for making proteins, which are the building blocks of cells and thus living organisms. In natural circumstances, this code is formed of four basic units, or “letters”: A, C, G and T (adenine, cytosine, guanine and thymine). These so-called DNA bases pair up, A with T and C with G, forming a long, readable sequence that varies from gene to gene. Textbooks will tell you that it is those four letters that are the recipe for life, or so we long believed.
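The pairing rule described above (A with T, C with G) is simple enough to capture in a few lines. A minimal sketch:

```python
# Watson-Crick pairing: A<->T, C<->G
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Return the base-paired partner strand, read in the same direction."""
    return "".join(PAIR[base] for base in strand)

def reverse_complement(strand):
    # The partner strand is antiparallel, so by convention it is
    # read in the reverse direction.
    return complement(strand)[::-1]

print(reverse_complement("GATTACA"))  # → TGTAATC
```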
Back in the ‘80s, scientists threw another base into the mix: 5-methylcytosine (mC). As the name suggests, mC is cytosine with a functional unit called methyl attached. But it was not for another decade that scientists realized the importance of mC: The addition of methyl groups to DNA can switch genes on or off in order to meet the needs of each tissue, given that every cell contains the same DNA sequences. Such modifications are known as epigenetic changes; these allow the environment to affect gene expression, but they also play a role in various diseases. For example, alterations in mC have been shown to contribute to the development of cancer, amongst many other conditions.
But it turns out that the DNA alphabet does not even end there: in recent years scientists have gradually expanded the list to eight letters. Now, researchers have published a detailed description of one more potential candidate, N6-methyladenine (6mA), in the journal Cell. As with mC, it is composed of the base adenine with a methyl group tacked on. While scientists have known about this modified base for some time, it was thought to exist exclusively in bacteria, where it serves to protect against the unwanted addition of foreign DNA from other organisms.
Now, a group of researchers from IDIBELL-Bellvitge Biomedical Research Institute have found that this is not the case, providing evidence that 6mA is not simply a phenomenon of primitive cells. As described in the journal Cell, scientists found that some more complex cells, called eukaryotic cells, also possess the base. Eukaryotes are organisms within one of the three domains of life, the others being archaea and bacteria.
More specifically, researchers discovered 6mA in three different groups of eukaryotes: green algae, flies and worms. This was made possible through the development of highly sensitive analytical techniques, which picked up the exceedingly low levels of this base that previously eluded detection. Interestingly, newly gathered data indicates that, like mC, 6mA may also have a gene regulatory function in these animals, which could suggest that it also behaves as an epigenetic mark.
Now that scientists have found this base in various organisms, the researchers want to scrutinize our own genomes to see if it also exists in humans. This would be interesting given the fact that evidence seems to suggest that 6mA may play a role in stem cells. If it does indeed exist in our own species, then researchers may have a job on their hands trying to figure out what precise role it plays in the cell.
After wandering around an unfamiliar part of town, can you sense which direction to travel to get back to the subway or your car? If so, you can thank your entorhinal cortex, a brain area recently identified as being responsible for our sense of direction. Variation in the signals in this area might even explain why some people are better navigators than others.
Cloudy days can be a bit of a downer. But when you add them all up across nearly 13 years of measurements, the bright side becomes more apparent.
NASA Earth Observatory just published a map that uses data collected between July 2002 and April 2015 to give an unparalleled view of the world’s cloudy (and sunny) spots.
One thing that’s immediately apparent is that the world is a pretty cloudy place. It’s no surprise the U.K.—renowned for its dreary weather—appears in white, indicating frequent clouds. Ditto for the Amazon rainforest, which requires copious clouds for its prodigious rain.
On the flip side, the Sahara, Atacama, Arabian and their fellow deserts (including Antarctica) are basically cloud free. Australia and the western U.S. are also light on cloud cover.
Aside from giving a sense of the globe’s overall cloudiness, the map also reveals key features of the climate system. The band of cloudiness just around the equator generally represents the Intertropical Convergence Zone, a girdle of thunderstorms around the earth that form there thanks to warm, moist air lifting off the ocean. The ITCZ, as it’s known in climatespeak, generally drifts back and forth across the equator with the seasons.
In comparison, dry air generally subsides from 15-30 degrees north and south of the equator. Not surprisingly, that’s where most of the world’s deserts are located.
When it comes to stars, three may not be a crowd. That’s according to a new paper published online before print in Astronomy & Astrophysics, which suggests that almost a quarter of twin star systems might have another sibling nearby.
Astronomers examined the light from nearly 14,000 eclipsing binary stars, fortuitous arrangements in which a stellar pair’s orbit is edge-on from our point of view. Telescopes on Earth can’t see the individual stars, but they do observe dips in light intensity as one goes behind the other. For lone binary stars, those dips occur at regular intervals. But in the presence of a third star, this timing speeds up and slows down as the pair orbits its mate and gets nearer and farther from Earth. Researchers found that 2% of the eclipsing binaries had this signature swing back and forth. However, in another 22%, the team found partial shifts potentially caused by a slower orbit around a companion.
These unfinished oscillations were just as likely to swing faster as slower, ruling out the possibility that the change was due to the interaction of the twin stars. If correct, this finding could modify our understanding of stellar formation, especially because scientists now believe binary stars are even more common than single-star systems. It also makes the skies of some exoplanets more exotic, as in the artist’s illustration of a planet in the three-star system Gliese 667 shown above.
Researchers have identified new molecules that kill cancer cells while protecting healthy cells and that could be used to treat a variety of different cancers. The research shines a light on what happens to cells at the moment they become cancerous. Professor Qing-Bin Lu, from the University of Waterloo's Faculty of Science, initiated a novel molecular-mechanism-based program to discover a new class of non-platinum-based-halogenated molecules that kill cancer cells, yet prevent healthy cells from being damaged.
Femtosecond time-resolved laser spectroscopy is a technique traditionally applied to study chemical reactions as they occur on a molecular level. The laser takes a series of rapid “snapshots” of molecules as they interact and change structure over time. The technique is part of a potential new field of science developed by Professor Lu called femtomedicine (FMD), which integrates the ultrafast laser with molecular biology and cell biology.
Professor Lu has applied the tool to understand the molecular mechanisms that cause cancer at the very moment when the DNA becomes damaged. He has also used it to investigate how radiation therapy and chemotherapy using chemical agents, in particular the widely used platinum chemotherapeutic Cisplatin, work in treating a variety of cancers.
“We know DNA damage is the initial step,” said Professor Lu. “With the novel femtomedicine approach we can go back to the very beginning to find out what causes DNA damage in the first place, then mutation, and then cancer.”
By understanding more about the fundamental mechanisms of the diseases, Professor Lu pre-selected molecules most likely to be effective as anti-cancer agents. In this case, he discovered a new family of non-platinum-based molecules similar in structure to Cisplatin but containing no toxic platinum.
Pre-clinical studies with various cultured human cells as well as on rodents show that these new molecules are effective against cervical, breast, ovarian, and lung cancers. Cisplatin, discovered more than 40 years ago, is an important, widely used platinum-based anti-cancer agent. Unfortunately, the inclusion of platinum in the molecule causes serious side effects like neurotoxicity, kidney damage, hearing loss, nausea and vomiting.
“It is extremely rare to discover anti-cancer agents that can selectively kill cancer cells and protect healthy cells, as well as being effective in treating many different types of cancer and having a novel molecular mechanism of action. These candidate drugs should have a high potential to pass through clinical trials and could ultimately save lives”, said Professor Lu.
Professor Lu has already applied for patents on the new family of non-platinum-based-halogenated molecules that he has discovered and hopes to start clinical trials soon.
Electrons were observed to travel in a solid at an unusually high velocity, which remained the same regardless of the electron energy. This anomalous light-like behavior is found in special two-dimensional materials such as graphene, but has now been realized in a three-dimensional bulk material. High-resolution angle-resolved photoemission spectroscopy, using synchrotron x-ray radiation, was used to substantiate the theoretically predicted exotic electronic structure.
A stable bulk material has been discovered that shows the same physics found in graphene, illuminating the detailed interplay between an electron's orbital motion and its intrinsic magnetic orientation (spin). The new material will be a test ground for theories on how electron interactions in solids shape exotic electron behavior, including the highest electron mobility among bulk materials.
Investigations of electronic behavior have expanded beyond familiar systems of metals, insulators, and semiconductors into the realm of strongly interacting electrons, which exhibit exotic relationships between the allowed electron velocities and their energy states. A key feature of the new materials is behavior in which the electron velocity does not depend on its energy. Such a relationship is a hallmark of photons, the energetic particles that make up a beam of light. This property is found in the new class of materials exhibiting a strong interaction between the electron trajectory and the electron spin alignment (called "spin-orbit coupling"). Two-dimensional versions of such systems (for example, graphene) have been recently explored, but those systems are hard to work with because of their ultra-thin film nature.
This work establishes graphene-like electronic behavior in the bulk three-dimensional materials Na3Bi and Cd3As2 and explains their exceptionally high electronic mobility. The required advances in electron spectroscopy techniques, used to investigate the electronic structure, employed an energy tunable bright x-ray source and a high-resolution spectrometer.
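The "velocity independent of energy" point above amounts to a group velocity that is constant for a linear (Dirac-like) dispersion but grows with energy for an ordinary parabolic band. A quick numeric comparison; the Fermi velocity used is a typical Dirac-material figure chosen purely for illustration:

```python
# Massless Dirac electrons: E = hbar * vF * k, so the group velocity
# (1/hbar) dE/dk = vF is the same at every energy.
# Ordinary parabolic band: E = hbar^2 k^2 / (2m), so velocity grows with k.

HBAR = 1.0545718e-34  # J*s
VF = 1.0e6            # m/s, illustrative Fermi velocity
M = 9.109e-31         # kg, free-electron mass

def v_dirac(k):
    return VF  # independent of k, and hence of energy

def v_parabolic(k):
    return HBAR * k / M  # grows linearly with k

for k in (1e8, 2e8, 4e8):  # wavevectors in 1/m
    print(f"k={k:.0e}  v_dirac={v_dirac(k):.2e} m/s  v_parabolic={v_parabolic(k):.2e} m/s")
```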
A new study shows that the microbial communities we carry in and on our bodies—known as the human microbiome—have the potential to uniquely identify individuals, much like a fingerprint. Harvard T.H. Chan School of Public Health researchers and colleagues demonstrated that personal microbiomes contain enough distinguishing features to identify an individual over time from among a research study population of hundreds of people. The study, the first to rigorously show that identifying people from microbiome data is feasible, suggests that we have surprisingly unique microbial inhabitants, but could raise potential privacy concerns for subjects enrolled in human microbiome research projects. The study appears online May 11, 2015 in the journal PNAS under the title “Identifying personal microbiomes using metagenomics codes”.
“Linking a human DNA sample to a database of human DNA ‘fingerprints’ is the basis for forensic genetics, which is now a decades-old field. We’ve shown that the same sort of linking is possible using DNA sequences from microbes inhabiting the human body—no human DNA required. This opens the door to connecting human microbiome samples between databases, which has the potential to expose sensitive subject information—for example, a sexually-transmitted infection, detectable from the microbiome sample itself,” said lead author Eric Franzosa, research fellow in the Department of Biostatistics at Harvard Chan.
Franzosa and colleagues used publicly available microbiome data produced through the Human Microbiome Project (HMP), which surveyed microbes in the stool, saliva, skin, and other body sites from up to 242 individuals over a months-long period. The authors adapted a classical computer science algorithm to combine stable and distinguishing sequence features from individuals’ initial microbiome samples into individual-specific “codes.” They then compared the codes to microbiome samples collected from the same individuals at follow-up visits and to samples from independent groups of individuals.
The results showed that the codes were unique among hundreds of individuals, and that a large fraction of individuals’ microbial “fingerprints” remained stable over a one-year sampling period. The codes constructed from gut samples were particularly stable, with more than 80% of individuals identifiable up to a year after the sampling period.
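The "codes" idea can be sketched as a greedy search: keep adding markers carried by the target person until no one else in the population carries them all. This toy version is only in the spirit of the paper's approach, with made-up marker names, not the authors' actual algorithm:

```python
def build_code(name, population, max_len=10):
    """Greedy toy: grow a set of markers carried by `name` until no other
    individual in `population` carries them all (a unique "code")."""
    target = population[name]
    others = [feats for person, feats in population.items() if person != name]
    code = set()
    remaining = others  # people the current code fails to rule out
    while remaining and (target - code) and len(code) < max_len:
        # pick the target marker that rules out the most remaining people
        best = max(target - code,
                   key=lambda f: sum(1 for feats in remaining if f not in feats))
        code.add(best)
        remaining = [feats for feats in remaining if code <= feats]
    return code if not remaining else None  # None: no unique code found

people = {
    "alice": {"m1", "m2", "m3"},
    "bob":   {"m1", "m2", "m4"},
    "carol": {"m2", "m3", "m4"},
}
print(sorted(build_code("alice", people)))  # → ['m1', 'm3']
```

The privacy concern follows directly: anyone holding a person's code can test new metagenomic samples against it, no human DNA required.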
Observant aliens visiting Earth and studying its civilizations would probably be pretty obsessed with wheat. They couldn’t fail to note how staggeringly many people we feed with the crop on this planet. “Wheat is one of the main staple crops in the world and provides 20% of daily protein and calories,” notes the Wheat Initiative, a project launched by G20 agricultural ministers. “With a world population of 9 billion in 2050, wheat demand is expected to increase by 60%. To meet the demand, annual wheat yield increases must grow from the current level of below 1% to at least 1.6%.” That’s why the punchline of a new study in the Proceedings of the National Academy of Sciences is pretty troubling. A warming climate, it suggests, could drive wheat yields in the opposite direction – down — in the United States and, possibly, elsewhere.
“The net effect of warming on yields is negative,” write Jesse Tack of the agricultural economics department of Mississippi State University and two colleagues, “even after accounting for the benefits of reduced exposure to freezing temperatures.” That’s no small matter, the study notes, in that wheat is “the largest source of vegetable protein in low-income countries.”
The study compared results from nearly 30 years of winter wheat trials across Kansas — a state that produced $2.8 billion worth of wheat crop in 2013 — with data on weather and precipitation. Winter wheat grows from September to May and faces two major temperature-related threats during this cycle — extreme winter cold, and extreme spring heat. Global warming ought to cut down on the freezing temperatures, but also amp up really hot ones. The study found, however, that on balance, the effect is more negative than positive, with a roughly 15 percent decline in wheat yields under a 2 degrees Celsius warming scenario, rising to around 40 percent with 4 degrees (C) of warming.
As for whether the Kansas-based research can easily be extrapolated to other regions where wheat is grown around the world, that depends highly on the local climate, says lead author Tack. So long as warming creates a situation in which temperatures in a given place more frequently reach 34 degrees Celsius (or 92.3 degrees Fahrenheit) during the growing season, then it could be bad for wheat, based on his study. “The tipping point is 34 degrees Celsius,” says Tack. “In terms of the estimated warming impacts, it’s largely going to be a matter of whether the new climate has increased exposure over 34 degrees.”
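The 34-degree tipping point lends itself to a simple exposure metric: degree days accumulated above the threshold. The daily temperatures below are invented for illustration; the study's actual regressions are far more detailed:

```python
def degree_days_above(temps_c, threshold=34.0):
    """Sum of daily exceedances above the threshold, in degree days."""
    return sum(max(0.0, t - threshold) for t in temps_c)

# Hypothetical daily maxima (deg C) for part of a growing season,
# before and after a uniform +2 C warming shift:
season = [28, 31, 33, 35, 36, 30, 34, 37]
warmed = [t + 2 for t in season]

print(degree_days_above(season))  # → 6.0
print(degree_days_above(warmed))  # → 15.0
```

Note how a uniform 2-degree shift more than doubles the exposure above the threshold: days just under 34 C cross it, and days already over it exceed it by more.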
A dramatic video has captured the behavior of cytotoxic T cells – the body’s ‘serial killers’ – as they hunt down and eliminate cancer cells before moving on to their next target.
In a study published today in the journal Immunity, a collaboration of researchers from the UK and the USA, led by Professor Gillian Griffiths at the University of Cambridge, describe how specialised members of our white blood cells known as cytotoxic T cells destroy tumour cells and virally-infected cells. Using state-of-the-art imaging techniques, the research team, with funding from the Wellcome Trust, has captured the process on film.
A recent finding by scientists from the Hospital for Sick Children, Toronto, and Duke University challenges long-held ideas about why our bones have a harder time healing as we age. Their research discovered that old mouse bones mend like youthful bones do when they're exposed to young blood after a fracture.
“The traditional concept is that as you get older, your bone cells kind of wear out so they can't heal as well, and we thought we'd find that during this study as well,” explains study co-author Benjamin Alman, of the Hospital for Sick Children. “But it turns out that it's not the bone cells, it's the blood cells. As you get older, the blood cells change the way they behave when you have an injury, and as a result the cells that heal bone aren't able to work as efficiently.”
The researchers paired lab mice, one old and one young, and subjected them to bone fractures, but that wasn't all they had in common. The living animals' circulatory systems were also joined together by a 150-year-old surgical technique known as parabiosis. Scientists removed a layer of skin from each mouse and stitched the exposed surfaces together. As the animals healed their capillaries joined, enabling their two hearts to pump the same blood throughout the two bodies as a single system. Parabiosis, which has been gaining new popularity in aging research, allowed Alman and colleagues to see what impacts the circulating factors of the younger mouse's blood had when introduced into the body of an older mouse.
The experiment, published this week in Nature Communications, suggests that young blood cells secrete some as-yet-unknown molecule, likely a protein or possibly some other chemical, that speeds up the healing of fractured bone. The molecule apparently does so by regulating levels of beta-catenin in bone cells known as osteoblasts. Keeping beta-catenin at the proper levels appears crucial for the formation of new high-density bone.
This ability is greatly diminished in older animals because their blood cells no longer secrete the molecule, whose exact chemical nature remains a mystery at this point. “My guess is that there are a number of proteins involved that are made differently as we get older, and that they are responsible for the difficulty in healing bone,” Alman says.
The findings could prove good news for aging humans, but healing our bones won’t require the type of transfusions used in the experiment—nor will it borrow the synthesized “True Blood” variety that may soon enter clinical trials. Sharing human blood in this manner raises a number of red flags ranging from practicality to possible medical complications.
Scientists have found a fossil, at least 16 million years old, of a female shrimplike creature with enormous fossilized sperm in her reproductive tract. It's a unique example of a female that copulated just before she died and began to turn to stone.
The fossil is a display of "ancient sex with gargantuan sperm," says the lead scientist, Renate Matzke-Karasz of Germany's Ludwig-Maximilians-University, via e-mail. "We have here direct evidence of a recent mating. All the co-authors are still amazed by the findings."
The post-coital specimen is an ancient example of a mussel shrimp, technically known as an ostracod. These tiny animals have hinged shells like a mussel's and live today in watery places from flower pots to the ocean, where they subsist on detritus in the water. The fossil specimens were discovered in an Australian cave where large numbers of bats roosted millions of years ago. The bats unwittingly made a major contribution to science: Their guano, Matzke-Karasz says, supplied chemicals that helped preserve the finest details of the mussel shrimps' anatomy.
The scientists found four fossilized female mussel shrimp and one male mussel shrimp with sperm in their bodies, some of the oldest fossilized sperm found to date. When they examined the fossilized male, "we almost couldn't believe our eyes," Matzke-Karasz says. The animal was replete with "sperm (that) looked like little ropes, exactly how modern ostracod giant sperm look!"
The mussel shrimp may be small, but the modern male is mighty, producing so-called "giant sperm" that can be four times longer than the animal itself. Only a handful of other animals, including some flies and moths, make giant sperm, whose purpose is still unclear.
The new study, appearing in this week's Proceedings of the Royal Society B: Biological Sciences, shows that male mussel shrimp may have been deploying giant sperm for more than 140 million years, says micropaleontologist David Horne of Britain's Queen Mary University of London.
Chinese search giant Baidu says it has invented a powerful supercomputer that brings new muscle to an artificial-intelligence technique giving software more power to understand speech, images, and written language.
The new computer, called Minwa and located in Beijing, has 72 powerful processors and 144 graphics processors, known as GPUs. Late Monday, Baidu released a paper claiming that the computer had been used to train machine-learning software that set a new record for recognizing images, beating a previous mark set by Google.
“Our company is now leading the race in computer intelligence,” said Ren Wu, a Baidu scientist working on the project, speaking at the Embedded Vision Summit on Tuesday. Minwa’s computational power would probably put it among the 300 most powerful computers in the world if it weren’t specialized for deep learning, said Wu. “I think this is the fastest supercomputer dedicated to deep learning,” he said. “We have great power in our hands—much greater than our competitors.”
Computing power matters in the world of deep learning, which has produced breakthroughs in speech, image, and face recognition and improved the image-search and speech-recognition services offered by Google and Baidu.
The technique is a souped-up version of an approach first established decades ago, in which data is processed by a network of artificial neurons that manage information in ways loosely inspired by biological brains. Deep learning involves using larger neural networks than before, arranged in hierarchical layers, and training them with significantly larger collections of data, such as photos, text documents, or recorded speech.
So far, bigger data sets and networks appear to always be better for this technology, said Wu. That’s one way it differs from previous machine-learning techniques, which had begun to produce diminishing returns with larger data sets. “Once you scaled your data beyond a certain point, you couldn’t see any improvement,” said Wu. “With deep learning, it just keeps going up.” Baidu says that Minwa makes it practical to create an artificial neural network with hundreds of billions of connections—hundreds of times more than any network built before.
A paper released Monday is intended to provide a taste of what Minwa’s extra oomph can do. It describes how the supercomputer was used to train a neural network that set a new record on a standard benchmark for image-recognition software. The ImageNet Classification Challenge, as it is called, involves training software on a collection of 1.5 million labeled images in 1,000 different categories, and then asking that software to use what it learned to label 100,000 images it has not seen before.
Software is compared on the basis of how often its top five guesses for a given image miss the correct answer. The system trained on Baidu’s new computer was wrong only 4.58 percent of the time. The previous best was 4.82 percent, reported by Google in March. One month before that, Microsoft had reported achieving 4.94 percent, becoming the first to better the average human performance of 5.1 percent.
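The top-5 scoring described above reduces to a simple count: an image is a miss only if its true label appears nowhere in the model's five highest-ranked guesses. A sketch with invented labels:

```python
def top5_error(predictions, truths):
    """Fraction of images whose correct label is NOT among the
    model's five highest-ranked guesses."""
    misses = sum(1 for top5, truth in zip(predictions, truths)
                 if truth not in top5[:5])
    return misses / len(truths)

# Toy example: 3 of 4 images have the true label somewhere in the top five.
preds = [["cat", "dog", "fox", "car", "cup"],
         ["dog", "cat", "fox", "car", "cup"],
         ["car", "bus", "van", "cab", "jet"],
         ["cup", "mug", "jar", "pot", "pan"]]
truth = ["cat", "cat", "car", "owl"]
print(top5_error(preds, truth))  # → 0.25
```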
Researchers have demonstrated a new metal matrix composite that is so light that it can float on water. A boat made of such lightweight composites will not sink despite damage to its structure. The new material also promises to improve automotive fuel economy because it combines light weight with heat resistance.
Although syntactic foams have been around for many years, this is the first development of a lightweight metal matrix syntactic foam. It is the work of a team of researchers from Deep Springs Technology (DST) and the New York University Polytechnic School of Engineering.
Their magnesium alloy matrix composite is reinforced with silicon carbide hollow particles and has a density of only 0.92 grams per cubic centimeter, compared with water's 1.0 g/cc. Not only does it have a density lower than that of water, but it is also strong enough to withstand the rigorous conditions faced in the marine environment.
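Why a density of 0.92 g/cc guarantees floating: by Archimedes' principle, a floating solid sinks until the water it displaces weighs as much as the solid itself, so the submerged volume fraction equals the ratio of the two densities. A quick illustrative calculation (not taken from the study):

```python
# Archimedes' principle: a floating solid displaces its own weight of
# water, so submerged volume fraction = solid density / water density.
rho_composite = 0.92  # g/cc, the DST/NYU syntactic foam
rho_water = 1.0       # g/cc

submerged_fraction = rho_composite / rho_water
print(f"{submerged_fraction:.0%} submerged, "
      f"{1 - submerged_fraction:.0%} above the waterline")
# 92% submerged, 8% above the waterline
```

The 8 percent reserve buoyancy is also why a hull made of the material stays afloat even when its structure is breached.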
Significant efforts in recent years have focused on developing lightweight polymer matrix composites to replace heavier metal-based components in automobiles and marine vessels. The technology for the new composite is very close to maturation and could be put into prototypes for testing within three years. Amphibious vehicles such as the Ultra Heavy-lift Amphibious Connector (UHAC) being developed by the U.S. Marine Corps can especially benefit from the light weight and high buoyancy offered by the new syntactic foams, the researchers explained.
"This new development of very light metal matrix composites can swing the pendulum back in favor of metallic materials," forecasted Nikhil Gupta, an NYU School of Engineering professor in the Department of Mechanical and Aerospace Engineering and the study's co-author. "The ability of metals to withstand higher temperatures can be a huge advantage for these composites in engine and exhaust components, quite apart from structural parts."
The syntactic foam made by DST and NYU captures the lightness of foams, but adds substantial strength. The secret of this syntactic foam starts with a matrix made of a magnesium alloy, which is then turned into foam by adding strong, lightweight silicon carbide hollow spheres developed and manufactured by DST. A single sphere's shell can withstand pressure of over 25,000 pounds per square inch (PSI) before it ruptures—one hundred times the maximum pressure in a fire hose.
Scientists have discovered a way to regrow bone tissue using the protein signals produced by stem cells. This technology could help treat victims who have experienced major trauma to a limb, like soldiers wounded in combat or casualties of a natural disaster. The new method improves on older therapies by providing a sustainable source for fresh tissue and reducing the risk of tumor formation that can arise with stem cell transplants.
The new study, published in Scientific Reports, is the first to extract the necessary bone-producing growth factors from stem cells and to show that these proteins are sufficient to create new bone. The stem cell-based approach was as effective as the current standard treatment in terms of the amount of bone created.
“This proof-of-principle work establishes a novel bone formation therapy that exploits the regenerative potential of stem cells,” says senior author Todd McDevitt, PhD, a senior investigator at the Gladstone Institutes. “With this technique, we can produce new tissue that is completely stem cell-derived and that performs similarly to the gold standard in the field.”
Digital medicine is poised to transform biomedical research, clinical practice and the commercial sector. Here we introduce a monthly column from R&D/venture creation firm PureTech tracking digital medicine's emergence.
Technology has already transformed the social fabric of life in the twenty-first century. It is now poised to profoundly influence disease management and healthcare. Beyond the hype of the 'mobile health' and 'wearable technology' movement, the ability to monitor our bodies and continuously gather data about human biology suggests new possibilities for both biomedical research and clinical practice. Just as the Human Genome Project ushered in the age of high-throughput genotyping, the ability to automate, continuously record, analyze and share standardized physiological and biological data augurs the beginning of a new era—that of high-throughput human phenotyping.
These advances are prompting new approaches to research and medicine, but they are also raising questions and posing challenges for existing healthcare delivery systems. How will these technologies alter biomedical research approaches, what types of experimental questions will researchers now be able to ask and what types of training will be needed? Will the ability to digitize individual characteristics and communicate by mobile technology empower patients and enable the modification of disease-promoting behaviors; at the same time, will it threaten patient privacy? Will doctors be prescribing US Food and Drug Administration (FDA)-cleared apps on a regular basis, not just to monitor and manage chronic disease but also to preempt acute disease episodes? Will the shift in the balance between disease treatment and early intervention have a broad economic impact on the healthcare system? How will the emergence of these new technologies reshape the healthcare industry and its underlying business models? What will be the defining characteristics of 'winning' products and companies?
These are just some of the questions we plan to ask over the coming months. In the meantime, we introduce here some of the key themes shaping R&D in the digital medicine field and focus on what they might mean for the biopharmaceutical and diagnostic/device industries.
Using sensitive observations from the Kepler space telescope, the researchers have uncovered evidence of daily weather cycles on six extra-solar planets seen to exhibit different phases. Such phase variations occur as different portions of these planets reflect light from their stars, similar to the way our own moon cycles through different phases.
Among the findings are indications of cloudy mornings on four of them and hot, clear afternoons on two others. "We determined the weather on these alien worlds by measuring changes as the planets circle their host stars, and identifying the day-night cycle," said Lisa Esteves, a PhD candidate in the Department of Astronomy & Astrophysics at the University of Toronto, and lead author of the study published today in The Astrophysical Journal.
"We traced each of them going through a cycle of phases in which different portions of the planet are illuminated by its star, from fully lit to completely dark," said Esteves.
Because the planets are very near to their stars, they are expected to rotate counter-clockwise - just as the majority of objects in our solar system do - with the right side moving in the direction of each planet's orbit. This causes an eastward movement of the planet's surface and therefore an eastward circulation of atmospheric winds. As a result, clouds that form on the planet's night side, where temperatures are cooler while it faces away from its host star, would be blown to the planet's morning side.
"As the winds continue to transport the clouds to the day side, they heat up and dissipate, leaving the afternoon sky cloud-free," said Esteves. "These winds also push the hot air eastward from the meridian, where it is the middle of the day, resulting in higher temperatures in the afternoon."
For four of the planets, the researchers saw excess brightness in the Kepler data that corresponds to when the morning side is visible. For the other two, they saw an excess when the evening side is visible. "By comparing the planets' previously determined temperatures to the phase cycle measurements provided by Kepler, we found that the excess brightness on the morning side is most likely generated by reflected starlight," said Esteves. "These four planets are not hot enough to generate this excess light through thermal emission.
"The excess light seen on the two very hot planets can be explained by thermal emission," said Esteves. "A likely explanation is that on these two planets, the winds are moving heat towards the evening side, resulting in the excess brightness."
The Kepler telescope was the ideal instrument for the study of exoplanet phase variations. The very precise measurements it provided and the vast amount of data it collected allowed astronomers to measure the tiny signals from these distant worlds. Most of the planets examined in this study are very hot and large, with temperatures greater than 1600 degrees Celsius and sizes comparable to Jupiter - conditions far from hospitable to life but excellent for phase measurements.
Coulomb interaction has a striking effect on electronic propagation in one-dimensional conductors. The interaction of an elementary excitation with neighboring conductors favors the emergence of collective modes, which eventually leads to the destruction of the Landau quasiparticle. In this process, an injected electron tends to fractionalize into separated pulses carrying a fraction of the electron charge. Here, a team of physicists uses two-particle interferences in the electronic analog of the Hong-Ou-Mandel experiment in a quantum Hall conductor at filling factor 2 to probe the fate of a single electron emitted in the outer edge channel and interacting with the inner one. By studying both channels, they analyze the propagation of the single electron and the generation of interaction-induced collective excitations in the inner channel. These complementary pieces of information reveal the fractionalization process in the time domain and establish its relevance for the destruction of the quasiparticle, which degrades into the collective modes.
When Joseph Dwyer’s aeroplane took a wrong turn into a thundercloud, the mistake paid off: the atmospheric physicist flew not only through a frightening storm but also into an unexpected — and mysterious — haze of antimatter. Although powerful storms have been known to produce positrons — the antimatter versions of electrons — the antimatter observed by Dwyer and his team cannot be explained by any known processes, they say. “This was so strange that we sat on this observation for several years,” says Dwyer, who is at the University of New Hampshire in Durham.
The flight took place six years ago, but the team is only now reporting the result (J. R. Dwyer et al. J. Plasma Phys.; in the press). “The observation is a puzzle,” says Michael Briggs, a physicist at the NASA Marshall Space Flight Center in Huntsville, Alabama, who was not involved in the report.
A key feature of antimatter is that when a particle of it makes contact with its ordinary-matter counterpart, both are instantly transformed into other particles in a process known as annihilation. This makes antimatter exceedingly rare. However, it has long been known that positrons are produced by the decay of radioactive atoms and by astrophysical phenomena, such as cosmic rays plunging into the atmosphere from outer space. In the past decade, research by Dwyer and others has shown that storms also produce positrons, as well as highly energetic photons, or γ-rays.
It was to study such atmospheric γ-rays that Dwyer, then at the Florida Institute of Technology in Melbourne, fitted a particle detector on a Gulfstream V, a type of jet plane typically used by business executives. On 21 August 2009, the pilots turned towards what looked, from its radar profile, to be the Georgia coast. “Instead, it was a line of thunderstorms — and we were flying right through it,” Dwyer says. The plane rolled violently back and forth and plunged suddenly downwards. “I really thought I was going to die.”
During those frightening minutes, the detector picked up three spikes in γ-rays at an energy of 511 kiloelectronvolts, the signature of a positron annihilating with an electron. Each γ-ray spike lasted about one-fifth of a second, Dwyer and his collaborators say, and was accompanied by some γ-rays of slightly lower energy. The team concluded that those γ-rays had lost energy as a result of travelling some distance and calculated that a short-lived cloud of positrons, 1–2 kilometres across, had surrounded the aircraft. But working out what could have produced such a cloud has proved hard. “We tried for five years to model the production of the positrons,” says Dwyer.
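The 511-kiloelectronvolt signature is no accident: it is the rest-mass energy of the electron (and of the positron), the energy each γ-ray carries away when the pair annihilates at rest. A quick check from E = mc², offered here as an illustration rather than as part of the team's analysis:

```python
# Rest-mass energy of the electron, E = m_e * c^2, expressed in keV.
m_e = 9.1093837015e-31   # electron mass, kg (CODATA value)
c = 2.99792458e8         # speed of light, m/s (exact)
eV = 1.602176634e-19     # joules per electronvolt (exact)

E_keV = m_e * c**2 / eV / 1e3
print(f"{E_keV:.1f} keV")  # 511.0 keV
```

A detector spike at exactly this energy is therefore treated as the fingerprint of electron–positron annihilation.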
Electrons discharging from charged clouds accelerate to close to the speed of light, and can produce highly energetic γ-rays, which in turn can generate an electron–positron pair when they hit an atomic nucleus. But the team did not detect enough γ-rays with sufficient energy to do this.
A new technique for visualizing the rapidly changing electronic structures of atomic-scale materials as they twist, tumble and traipse across the nanoworld is taking shape at the California Institute of Technology. There, researchers have for the first time successfully combined two existing methods to visualize the structural dynamics of a thin film of graphite.
Described this week in the journal Structural Dynamics, from AIP Publishing and the American Crystallographic Association, their approach integrated a highly specific structural analysis technique known as "core-loss spectroscopy" with another approach known as ultrafast four-dimensional (4-D) electron microscopy—a technique pioneered by the Caltech laboratory, which is headed by Nobel laureate Ahmed Zewail.
In core-loss spectroscopy, the high-speed probing electrons can selectively excite core electrons of a specific atom in a material (core electrons are those bound most tightly to the atomic nucleus). The amount of energy that the core electrons gain gives insight into the local electronic structure, but the technique is limited in the time resolution it can achieve—traditionally too slow for fast catalytic reactions. 4-D electron microscopy also reveals the structural dynamics of materials over time by using short pulses of high-energy electrons to probe samples, and it is engineered for ultrafast time resolution.
Combining these two techniques allowed the team to precisely track local changes in electronic structure over time with ultrafast time resolution.
"In this work, we demonstrate for the first time that we can probe deep core electrons with rather high binding energies exceeding 100 eV," said Renske van der Veen, one of the authors of the new study. "We are equipped with an ultrafast probing tool that can investigate, for example, the relaxation processes in photocatalytic nanoparticles, photoinduced phase transitions in nanoscale materials or the charge transfer dynamics at interfaces."
An international team of scientists has announced the discovery of a new state of matter in a material that appears to be an insulator, superconductor, metal and magnet all rolled into one, saying that it could lead to the development of more effective high-temperature superconductors.
Why is this so exciting? Well, if these properties are confirmed, this new state of matter will allow scientists to better understand why some materials have the potential to achieve superconductivity at a relatively high critical temperature (Tc) - "high" as in −135 °C as opposed to −243.2 °C. Because superconductivity allows a material to conduct electricity without resistance - meaning no heat, sound, or any other form of energy release - achieving this would revolutionise how we use and produce energy, but it’s only feasible if we can achieve it at so-called high temperatures.
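To see why −135 °C counts as "high", it helps to convert both figures to absolute temperature, where the gap becomes obvious. A trivial illustrative conversion:

```python
def c_to_k(celsius):
    """Convert degrees Celsius to kelvin."""
    return celsius + 273.15

# "High-temperature" superconductors vs. conventional ones:
print(f"{c_to_k(-135.0):.2f} K")  # 138.15 K
print(f"{c_to_k(-243.2):.2f} K")  # 29.95 K
```

On the kelvin scale the "high" critical temperature is more than four times the conventional one, which is the difference between cooling with relatively cheap liquid nitrogen and needing liquid helium.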
As Michael Byrne explains, when we talk about states of matter, it’s not just solids, liquids, gases, and maybe plasmas that we have to think about. We also have to consider the more obscure states that don’t occur in nature, but are rather created in the lab - Bose–Einstein condensate, degenerate matter, supersolids and superfluids, and quark-gluon plasma, for example.
By introducing rubidium into carbon-60 molecules - more commonly known as 'buckyballs' - a team led by chemist Kosmas Prassides from Tohoku University in Japan was able to change the distance between them, which forced them into a new, crystalline structure. When put through an array of tests, this structure displayed a combination of insulating, superconducting, metallic, and magnetic phases, including a brand new one, which the researchers have named 'Jahn-Teller metals'.
Named after the Jahn-Teller effect, which is used in chemistry to describe how, at low pressures, the geometric arrangement of molecules and ions in an electronic state can become distorted, this new state of matter allows scientists to transform an insulator - which can’t conduct electricity - into a conductor by simply applying pressure.
There’s a whole lot of lab-work to be done before this discovery will mean anything for practical energy production in the real world, but that’s science for you. And it’s got people excited already, as chemist Elisabeth Nicol from the University of Guelph in Canada told Hamish Johnston at PhysicsWorld: "Understanding the mechanisms at play and how they can be manipulated to change the Tc surely will inspire the development of new superconducting materials".
Rats have more heart than you might think. When one is drowning, another will put out a helping paw to rescue its mate. This is especially true for rats that previously had a watery near-death experience, say researchers.
Helping behavior is a prosocial behavior whereby an individual helps another irrespective of disadvantages to him or herself. In the present study, scientists examined whether rats would help distressed, conspecific rats that had been soaked with water.

In Experiment 1, rats quickly learned to liberate a soaked cagemate from the water area by opening the door to allow the trapped rat into a safe area. Additional tests showed that the presentation of a distressed cagemate was necessary to induce rapid door-opening behavior. In addition, it was shown that rats dislike soaking and that rats that had previously experienced a soaking were quicker to learn how to help a cagemate than those that had never been soaked.

In Experiment 2, the results indicated that rats did not open the door to a cagemate that was not distressed. In Experiment 3, the researchers tested behavior when rats were forced to choose between opening the door to help a distressed cagemate and opening a different door to obtain a food reward. Irrespective of how they learned to open the door, in most test trials, rats chose to help the cagemate before obtaining a food reward, suggesting that the relative value of helping others is greater than the value of a food reward.

These results suggest that rats can behave prosocially and that helper rats may be motivated by empathy-like feelings toward their distressed cagemate.