NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all relevant postings SORTED by TOPICS.
You can also type your own query:
e.g., type "dna" to find all articles involving that keyword.
Since the 1960s, theatergoers have shelled out for crude 3-D glasses, polarized glasses, and shutter glasses to enhance their viewing experience. These basic devices, used to trick the brain into perceiving an artificial three-dimensional reality, may soon be rendered obsolete with the introduction of new holography technology developed by Tel Aviv University researchers.
Tel Aviv University doctoral students Yuval Yifat, Michal Eitan, and Zeev Iluz have developed highly efficient holography based on nanoantennas that could be used for security as well as medical and recreational purposes. Prof. Yael Hanein, of TAU's School of Electrical Engineering and head of TAU's Center for Nanoscience and Nanotechnology, and Prof. Jacob Scheuer and Prof. Amir Boag of the School of Electrical Engineering, led the development team. Their research, published in the American Chemical Society's publication Nano Letters, uses the parameters of light itself to create dynamic and complex holographic images.
In order to effect a three-dimensional projection using existing technology, two-dimensional images must be "replotted"—rotated and expanded to achieve three-dimension-like vision. But the team's nanoantenna technology permits newly designed holograms to replicate the appearance of depth without being replotted. The applications for the technology are vast and diverse, according to the researchers, who have already been approached by commercial entities interested in the technology.
"We had this interesting idea—to play with the parameters of light, the phase of light," said Yifat. "If we could dynamically change the relation between light waves, we could create something that projected dynamically—like holographic television, for example. The applications for this are endless. If you take light and shine it on a specially engineered nanostructure, you can project it in any direction you want and in any form that you want. This leads to interesting results."
The researchers worked in the lab for over a year to develop and patent a small metallic nanoantenna chip that, together with an adapted holography algorithm, could determine the "phase map" of a light beam. "Phase corresponds with the distance light waves have to travel from the object you are looking at to your eye," said Prof. Hanein. "In real objects, our brains know how to interpret phase information so you get a feeling of depth, but when you look at a photograph, you often lose this information so the photographs look flat. Holograms save the phase information, which is the basis of 3-D imagery. This is truly one of the holy grails of visual technology."
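The relation Prof. Hanein describes between phase and distance can be made concrete with a short sketch. The snippet below is illustrative only, not the team's patented algorithm: it assumes a hypothetical `phase` helper and a 532 nm (green) wavelength to show how a difference in depth becomes a difference in optical phase, the information a hologram preserves and a flat photograph loses.

```python
import math

def phase(d_m, lam=532e-9):
    """Optical phase (radians, wrapped to [0, 2*pi)) accumulated by light
    of wavelength lam after traveling a distance d_m, both in meters."""
    return (2 * math.pi * d_m / lam) % (2 * math.pi)

# Two points on an object a quarter-wavelength apart in depth produce
# light that arrives 90 degrees (pi/2 radians) out of phase.
near, far = 0.0, 532e-9 / 4
print(phase(far) - phase(near))  # ~ pi/2, about 1.5708
```

A "phase map" in this sense is just this quantity evaluated across the whole light field; the nanoantenna chip's job, per the article, is to impose such a map on the beam.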
According to the researchers, their methodology is the first of its kind to successfully produce high-resolution holographic imagery that can be projected efficiently in any direction.
"We can use this technology to reflect any desired object," said Prof. Scheuer. "Before, scientists were able to produce only basic shapes—circles and stripes, for example. We used, as our model, the logo of Tel Aviv University, which has a very specific design, and were able to achieve the best results seen yet."
NASA's Cassini spacecraft has obtained the highest-resolution movie yet of a unique six-sided jet stream, known as the hexagon, around Saturn's north pole. The hexagon, which is wider than two Earths, owes its appearance to the jet stream that forms its perimeter. The jet stream forms a six-lobed, stationary wave which wraps around the north polar regions at a latitude of roughly 77 degrees North.
This is the first hexagon movie of its kind, using color filters, and the first to show a complete view of the top of Saturn down to about 70 degrees latitude. About 20,000 miles (30,000 kilometers) across, the hexagon is a wavy jet stream of 200-mile-per-hour winds (about 322 kilometers per hour) with a massive, rotating storm at the center. No other weather feature in the solar system is so regular and so persistent.
"The hexagon is just a current of air, and weather features out there that share similarities to this are notoriously turbulent and unstable," said Andrew Ingersoll, a Cassini imaging team member at the California Institute of Technology in Pasadena. "A hurricane on Earth typically lasts a week, but this has been here for decades -- and who knows -- maybe centuries."
Weather patterns on Earth are interrupted when they encounter friction from landforms or ice caps. Scientists suspect the stability of the hexagon has something to do with the lack of solid landforms on Saturn, which is essentially a giant ball of gas.
A team of physicists from the Paul-Drude-Institut für Festkörperelektronik (PDI) in Berlin, Germany, NTT Basic Research Laboratories in Atsugi, Japan, and the U.S. Naval Research Laboratory (NRL) has used a scanning tunneling microscope to create quantum dots with identical, deterministic sizes. The perfect reproducibility of these dots opens the door to quantum dot architectures completely free of uncontrolled variations, an important goal for technologies from nanophotonics to quantum information processing as well as for fundamental studies. The complete findings are published in the July 2014 issue of the journal Nature Nanotechnology.
Quantum dots are often regarded as artificial atoms because, like real atoms, they confine their electrons to quantized states with discrete energies. But the analogy breaks down quickly, because while real atoms are identical, quantum dots usually comprise hundreds or thousands of atoms - with unavoidable variations in their size and shape and, consequently, in their properties and behavior. External electrostatic gates can be used to reduce these variations. But the more ambitious goal of creating quantum dots with intrinsically perfect fidelity by completely eliminating statistical variations in their size, shape, and arrangement has long remained elusive.
Creating atomically precise quantum dots requires every atom to be placed in a precisely specified location without error. The team assembled the dots atom-by-atom, using a scanning tunneling microscope (STM), and relied on an atomically precise surface template to define a lattice of allowed atom positions. The template was the surface of an InAs crystal, which has a regular pattern of indium vacancies and a low concentration of native indium adatoms adsorbed above the vacancy sites. The adatoms are ionized +1 donors and can be moved with the STM tip by vertical atom manipulation. The team assembled quantum dots consisting of linear chains of N = 6 to 25 indium atoms; the example shown here is a chain of 22 atoms.
Stefan Fölsch, a physicist at the PDI who led the team, explained that "the ionized indium adatoms form a quantum dot by creating an electrostatic well that confines electrons normally associated with a surface state of the InAs crystal. The quantized states can then be probed and mapped by scanning tunneling spectroscopy measurements of the differential conductance." These spectra show a series of resonances labeled by the principal quantum number n. Spatial maps reveal the wave functions of these quantized states, which have n lobes and n - 1 nodes along the chain, exactly as expected for a quantum-mechanical electron in a box. For the 22-atom chain example, the states up to n = 6 are shown.
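The "n lobes and n - 1 nodes" pattern is exactly the textbook particle-in-a-box result, and it is easy to verify numerically. The sketch below is a generic quantum-mechanics illustration, not the team's analysis code; the box length and sampling density are arbitrary choices.

```python
import math

def psi(n, x, L=1.0):
    """Particle-in-a-box wavefunction for principal quantum number n:
    n lobes and n - 1 interior nodes across a box of length L."""
    return math.sqrt(2 / L) * math.sin(n * math.pi * x / L)

def count_nodes(n, L=1.0, samples=1000):
    """Count interior sign changes of psi_n sampled across the box."""
    xs = [L * (i + 0.5) / samples for i in range(samples)]
    vals = [psi(n, x, L) for x in xs]
    return sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)

for n in range(1, 7):
    print(n, count_nodes(n))  # nodes = n - 1, matching the spectroscopy maps
```

The spatial maps in the paper are, in effect, experimental pictures of |psi_n|^2 for chains of ionized indium adatoms.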
Sexual reproduction is an ancient feature of life on earth, and the familiar X and Y chromosomes in humans and other model species have led to the impression that sex determination mechanisms are old and conserved. In fact, males and females are determined by diverse mechanisms that evolve rapidly in many taxa. Yet this diversity in primary sex-determining signals is coupled with conserved molecular pathways that trigger male or female development. Conflicting selection on different parts of the genome and on the two sexes may drive many of these transitions, but few systems with rapid turnover of sex determination mechanisms have been rigorously studied. Here we survey our current understanding of how and why sex determination evolves in animals and plants and identify important gaps in our knowledge that present exciting research opportunities to characterize the evolutionary forces and molecular pathways underlying the evolution of sex determination.
A restored functional cornea following transplantation of human ABCB5-positive limbal stem cells to limbal stem cell-deficient mice.
Limbal stem cells, which reside in the eye’s limbus, help maintain and regenerate corneal tissue. Their loss due to injury or disease is one of the leading causes of blindness.
In the past, tissue or cell transplants have been used to help the cornea regenerate, but it was unknown whether there were actual limbal stem cells in the grafts, or how many, and the outcomes were not consistent.
ABCB5 allowed the researchers to locate hard-to-find limbal stem cells in tissue from deceased human donors and use these stem cells to regrow anatomically correct, fully functional human corneas in mice.
“Limbal stem cells are very rare, and successful transplants are dependent on these rare cells,” says Bruce Ksander, Ph.D., of Mass. Eye and Ear, co-lead author on the study with post-doctoral fellow Paraskevi Kolovou, M.D. “This finding will now make it much easier to restore the corneal surface. It’s a very good example of basic research moving quickly to a translational application.”
ABCB5 was originally discovered in the lab of Markus Frank, M.D., of Boston Children’s Hospital, and Natasha Frank, M.D., of the VA Boston Healthcare System and Brigham and Women’s Hospital (co-senior investigators on the study) as being produced in tissue precursor cells in human skin and intestine.
In the new work, using a mouse model developed by the Frank lab, they found that ABCB5 also occurs in limbal stem cells and is required for their maintenance and survival, and for corneal development and repair. Mice lacking a functional ABCB5 gene lost their populations of limbal stem cells, and their corneas healed poorly after injury.
“ABCB5 allows limbal stem cells to survive, protecting them from apoptosis [programmed cell death],” says Markus Frank. “The mouse model allowed us for the first time to understand the role of ABCB5 in normal development, and should be very important to the stem cell field in general,” adds Natasha Frank.
Markus Frank is working with the biopharmaceutical industry to develop a clinical-grade ABCB5 antibody that would meet U.S. regulatory approvals.
Spin-coating a polymer solution (green) to create a carbon nanosheet with characteristics similar to graphene, without the defects (black).
A team of Korean researchers has synthesized hexagonal carbon nanosheets similar to graphene, using a polymer. The new material is free of the defects and complexity involved in producing graphene, and can substitute for graphene as transparent electrodes for organic solar cells and in semiconductor chips, the researchers say.
The research team is led by Han-Ik Joh at Korea Institute of Science and Technology (KIST), Seok-In Na at Chonbuk National University, and Byoung Gak Kim at Korea Research Institute of Chemical Technology. The research was funded by the KIST Proprietary Research Project and National Research Foundation of Korea.
Na explains: "Through a catalyst- and transfer-free process, we fabricated indium tin oxide (ITO)-free organic solar cells (OSCs) using a carbon nanosheet (CNS) with properties similar to graphene. The morphological and electrical properties of the CNS, which is derived from a polymer of intrinsic microporosity-1 (PIM-1) composed mainly of several aromatic hydrocarbons and cycloalkanes, can be easily controlled by adjusting the polymer concentration. The CNSs, which are prepared by simple spin-coating and heat-treatment on a quartz substrate, are directly used as the electrodes of ITO-free OSCs, showing a high efficiency of approximately 1.922% under 100 mW cm−2 illumination and air mass 1.5 G conditions. This catalyst- and transfer-free approach is highly desirable for electrodes in organic electronics."
An international team of astronomers has developed a 3D model of a giant cloud ejected by the massive binary system Eta Carinae during its 19th century outburst. Eta Carinae lies about 7,500 light-years away in the southern constellation of Carina and is one of the most massive binary systems astronomers can study in detail. The smaller star is about 30 times the mass of the sun and may be as much as a million times more luminous. The primary star contains about 90 solar masses and emits 5 million times the sun's energy output. Both stars are fated to end their lives in spectacular supernova explosions.
Between 1838 and 1845, Eta Carinae underwent a period of unusual variability during which it briefly outshone Canopus, normally the second-brightest star. As a part of this event, which astronomers call the Great Eruption, a gaseous shell containing at least 10 and perhaps as much as 40 times the sun's mass was shot into space. This material forms a twin-lobed dust-filled cloud known as the Homunculus Nebula, which is now about a light-year long and continues to expand at more than 1.3 million mph (2.1 million km/h).
Using the European Southern Observatory's Very Large Telescope and its X-Shooter spectrograph, the team imaged near-infrared, visible and ultraviolet wavelengths along 92 separate swaths across the nebula, making the most complete spectral map to date. The researchers have used the spatial and velocity information provided by this data to create the first high-resolution 3D model of the Homunculus Nebula.
The shape model was developed using only a single emission line of near-infrared light emitted by molecular hydrogen gas. The characteristic 2.12-micron light shifts in wavelength slightly depending on the speed and direction of the expanding gas, allowing the team to probe even dust-obscured portions of the Homunculus that face away from Earth.
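The wavelength-to-velocity conversion behind that statement is the standard non-relativistic Doppler relation. A minimal sketch, using an assumed example shift (the article does not quote the actual measured shifts):

```python
C = 299_792.458  # speed of light, km/s

def radial_velocity(lam_obs_um, lam_rest_um=2.12):
    """Line-of-sight gas velocity in km/s from an observed wavelength;
    positive means receding. Non-relativistic approximation."""
    return C * (lam_obs_um - lam_rest_um) / lam_rest_um

# A hypothetical shift of ~0.0046 micron in the 2.12-micron H2 line
# corresponds to roughly 650 km/s, the order of magnitude of the
# Homunculus expansion speed quoted above (1.3 million mph ~ 580 km/s).
print(radial_velocity(2.1246))
```

Measuring this shift along each of the 92 swaths is what turns a 2D spectral map into the velocity field needed for the 3D model.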
Scientists at the Massachusetts Institute of Technology are developing an audio reading device to be worn on the index finger of people whose vision is impaired, giving them affordable and immediate access to printed words.
The so-called FingerReader, a prototype produced by a 3-D printer, fits like a ring on the user’s finger and is equipped with a small camera that scans text. A synthesized voice reads words aloud, quickly translating books, restaurant menus and other materials needed for daily living, especially away from home or office.
Reading is as easy as pointing the finger at text. Special software tracks the finger movement, identifies words and processes the information. The device has vibration motors that alert readers when they stray from the script, said Roy Shilkrot, who is developing the device at the MIT Media Lab.
For Jerry Berrier, 62, who was born blind, the promise of the FingerReader is its portability and offer of real-time functionality at school, a doctor’s office and restaurants.
“When I go to the doctor’s office, there may be forms that I wanna read before I sign them,” Berrier said.
He said there are other optical character recognition devices on the market for those with vision impairments, but none that he knows of that will read in real time.
Nanoscribe GmbH, a spin-off of Karlsruhe Institute of Technology (KIT), has built the world’s fastest 3D printer of micro- and nanostructures.
At Photonics West, the leading international fair for photonics, taking place in San Francisco (USA) this week, Nanoscribe GmbH, a spin-off of Karlsruhe Institute of Technology (KIT), presents the world’s fastest 3D printer of micro- and nanostructures. With this printer, the smallest three-dimensional objects, often smaller than the diameter of a human hair, can be manufactured with minimum time consumption and maximum resolution. The printer is based on a novel laser lithography method.
“The success of Nanoscribe is an example of KIT’s excellent entrepreneurial culture and confirms our strategy of specifically supporting spin-offs. In this way, research results are transferred rapidly and sustainably to the market,” says Dr. Peter Fritz, KIT Vice President for Research and Innovation. In early 2008, Nanoscribe was founded as the first spin-off of KIT and has since established itself as the world’s market and technology leader in the area of 3D laser lithography.
Last year, 18 spin-offs were established at KIT. The 3D laser lithography systems developed by Nanoscribe, which is still located on KIT’s Campus North, are used for research by KIT and by scientists worldwide. Work in the area of photonics concentrates on replacing conventional electronics with higher-performance optical circuits. For this purpose, Nanoscribe systems are used to print polymer waveguides reaching data transfer rates of more than 5 terabits per second.
In the biosciences, the systems produce tailored scaffolds for cell growth studies, among other applications. In materials research, functional materials with enhanced performance are developed for lightweight construction to reduce the consumption of resources. Customers include universities and research institutions as well as industrial companies.
Increased Speed: Hours Turn into Minutes
By means of the new laser lithography method, printing speed is increased by a factor of about 100. This increase in speed results from the use of a galvo mirror system, a technology also applied in laser show devices and in the scanning units of CD and DVD drives. Reflecting a laser beam off the rotating galvo mirrors facilitates rapid and precise positioning of the laser focus. “We are revolutionizing 3D printing on the micrometer scale. Precision and speed are achieved by the industrially established galvo technology. Our product benefits from more than one decade of experience in photonics, the key technology of the 21st century,” says Martin Hermatschweiler, the managing director of Nanoscribe GmbH.
After decades with the title, an extinct bird loses its claim to the widest wing span in history.
When South Carolina construction workers came across the giant, winged fossil at the Charleston airport in 1983, they had to use a backhoe to pull the bird, which lived about 25 million years ago, up from the earth.
The fossil turned out to belong to a previously unknown species. But that left researchers facing a big question: could such a large bird, with a wingspan of 20 to 24 feet, actually get off the ground? After all, the larger the bird, the less likely its wings can lift it unaided.
The answer came from Dan Ksepka, paleontologist and science curator at the Bruce Museum in Greenwich, Conn.
Pelagornis sandersi relied on the ocean to keep it aloft. Similar in many ways to a modern-day albatross — although with at least twice the wingspan and very different in appearance, Ksepka said — the bird probably needed a lot of help to fly. It had to run downhill into a head wind, catching the air like a hang glider. Once airborne, it relied on air currents rising from the ocean to keep it gliding.
By observing specific X-ray emissions from iron atoms in the core of supernova remnants, astronomers developed a new technique that provides a clear and rapid means of classifying supernova remnants.
An international team of astronomers using data from the Japan-led Suzaku X-ray observatory has developed a powerful technique for analyzing supernova remnants, the expanding clouds of debris left behind when stars explode. The method provides scientists with a way to quickly identify the type of explosion and offers insights into the environment surrounding the star before its destruction.
“Supernovae imprint their remnants with X-ray evidence that reveals the nature of the explosion and its surroundings,” said lead researcher Hiroya Yamaguchi, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Thanks to Suzaku, we are now learning how to interpret these signals.”
The technique involves observing specific X-ray emissions from iron atoms in the core of supernova remnants. Even after thousands of years, these atoms remain extremely hot, stripped of most of the 26 electrons that accompany iron atoms under normal conditions on Earth. The metal is formed in the centers of shattered stars toward the end of their energy-producing lives and in their explosive demise, which makes it a key witness to stellar death.
“Because Suzaku has a better sensitivity to iron emission lines than any other X-ray mission, it’s the ideal tool for investigating supernova remnants at these energies,” said Robert Petre, chief of Goddard’s X-ray Astrophysics Laboratory and a member of the study team. Suzaku was launched into Earth orbit in 2005, the fifth in a series of Japanese X-ray astronomy satellites. It was developed and is operated cooperatively by the United States and Japan.
Astronomers estimate that a supernova occurs once or twice a century in our home galaxy, the Milky Way. Each time, a blast wave and a shell of hot stellar debris expands rapidly away from the detonation, creating a supernova remnant that can be detected for tens of thousands of years. The expanding cloud slows over time as it mixes with interstellar gas and eventually becomes indistinguishable from it.
For decades, scientists have tried to understand the complex and gruesome relationship between the parasitic emerald wasp Ampulex compressa and its much larger victim, the common household cockroach Periplaneta americana.
At first glance, this parasite-prey relationship seems much like any other: the female wasp stings the cockroach, lays an egg on its abdomen, and once hatched, the hungry larva feeds on the cockroach. However, while most parasitic insects tend to paralyse their victims with a venomous sting, the emerald wasp instead manipulates the cockroach’s behaviour, essentially transforming it into a zombie slave.
With two stings the cockroach is left able to walk, but entirely robbed of the power to initiate its own movement. The wasp, now tired after administering two stings, regains its energy by cutting off the ends of the cockroach’s antennae and drinking its blood. Revitalised, it then latches on to the stung cockroach’s antennae and, much like an obedient toddler being led to his first day of school, the submissive insect follows the wasp’s orders.
The first sting, administered to a mass of nerve tissue in the cockroach’s thorax, contains large quantities of gamma amino butyric acid (GABA), and complementary chemicals called taurine and beta alanine. GABA is a neurotransmitter that blocks the transmission of motor signals between nerves, and, together with the other two chemicals, it temporarily paralyses the cockroach’s front legs. This prevents the cockroach from escaping while the wasp inflicts the second, more toxic sting directly into the roach’s brain.
It is the second sting that turns the cockroach into a zombie, and contains what Frederic Libersat and his colleagues at Ben Gurion University refer to as a “neurotoxic cocktail”. The venom of the second sting blocks the receptors for another neurotransmitter called octopamine, which is involved in the initiation of spontaneous and complex movements such as walking.
Libersat has shown that unstung cockroaches injected with an octopamine-like compound show an increase in walking behaviour. Those injected with a chemical that blocks octopamine, however, show a reduction in spontaneous walking, much like the victims of the wasp sting. Zombie cockroaches were also able to recover from their stupor and walk after they were injected with a chemical that reactivates octopamine receptors.
A spider-like creature's remains were so well preserved in fossil form that scientists could see all its leg joints, allowing them to recreate its likely gait using computer graphics.
Known as a trigonotarbid, the animal was one of the first predators on land. Its prey were probably early flightless insects and other invertebrates, which it would run down and jump on.
"We know quite a bit about how it lived," said Russell Garwood, a palaeontologist with the University of Manchester, UK. "We can see from its mouth parts that it pre-orally digested its prey - something that most arachnids do - because it has a special filtering plate in its mouth. So, that makes us fairly sure it vomited digestive enzymes on to its prey and then sucked up liquid food," he explained.
The trigonotarbid specimens studied by Dr Garwood and colleagues are just a few millimetres in length. They were unearthed in Scotland, near the Aberdeenshire town of Rhynie. Its translucent Early Devonian chert sediments are renowned for their exquisite fossils.
The team used a collection held at the Natural History Museum in London that had been prepared in the 1920s. The rock had been cut into extremely fine slices, just a few tens of microns thick, making it possible to construct 3D models of the arachnids, much as a doctor might do with the X-ray slices obtained in a CT scan.
"We could see the articulation points in the legs," explained Dr Garwood. "Between each part of the leg, there are darker pieces where they join, and that allowed us to work out the range of movement.
"We then compared that with the gaits of modern spiders, which are probably a good analogy because they have similar leg proportions. The software enabled us to see the centre of mass and find a gait that worked. If it's too far back compared to the legs, the posterior drags on the ground. The trigonotarbid is an alternating tetrapod, meaning there are four feet on the ground at any one time."
"This new study has gone further and shows us how they probably walked. For me, what's really exciting here is that scientists themselves can make these animations now, without needing the technical wizardry (and immense costs) of a Jurassic-Park style film. When I started working on fossil arachnids, we were happy if we could manage a sketch of what they used to look like. Now, they run across our computer screens."
The work is part of a special collection of papers on 3D visualisations of fossils published in the Journal of Paleontology.
A look at three leading approaches using inlays to expand presbyopic patients’ range of vision.
Periodically, the search for a “cure” for presbyopia produces a new set of treatment options. The latest approach is the corneal inlay, intended to improve near vision without compromising distance vision in emmetropic presbyopes—and possibly non-emmetropes as well.
Three variations on the concept of placing an implant inside the cornea are in different stages of the approval process. The Kamra inlay (from AcuFocus in Irvine, Calif.) uses the pinhole principle to increase depth of field; the Raindrop (from ReVision Optics in Laguna Hills, Calif.) makes the cornea multifocal by reshaping it; and the Flexivue Microlens (from Presbia in Amsterdam) creates multifocal vision using an in-cornea lens.
Closer to a Presbyopia Cure? “All of these inlays seem to work,” notes Dr. Hovanesian. “You can make theoretical arguments as to why one might be better than the others, but they all seem to achieve a high level of near vision in the range of J1, while only minimally compromising distance vision to 20/20 or 20/25.”
“Overall, the data from the FDA trial of the Kamra, like the data from outside the United States regarding the Flexivue, indicates that these inlays are very safe,” adds Dr. Maloney.
Of course, they have a few disadvantages. Dr. Maloney notes that all of them reduce distance vision to some degree. “That’s the trade-off for improved reading vision,” he says. “And all of them cause night glare to some degree; that’s the trade-off for changing the way the eye focuses light. So if patients aren’t happy, it’s because their night vision isn’t good enough, their distance vision isn’t good enough, or their reading vision isn’t good enough—the inlay isn’t strong enough to give them the reading vision they need. Those limitations are probably common to all inlays. But the inlays can be explanted, and vision returns to being very close to what it was before surgery. In addition, we haven’t seen significant adverse effects with the current generation of these inlays.”
“Using an inlay requires a compromise in distance vision,” agrees Dr. Hovanesian. “That’s the nature of adding something to an emmetropic visual system. However, you’re usually doing it in the nondominant eye in a patient who is a good adapter. For most of these patients, what they sacrifice is well worth it for what they gain.
“The Raindrop inlay, and inlays in general, are going to serve a very important purpose,” he concludes. “As they become approved, we’re going to find that patients really want this kind of technology. It’s appealing because it serves emmetropic presbyopes—patients who are not well served by any other modality we have. Many of these patients are not willing to try monovision, and they’re generally too young for lens implant surgery. They want a quick and easy solution, and they like the idea of something that’s reversible if it doesn’t work out.”
“I think there will definitely be a place for these inlays in our clinical practices,” agrees Dr. Maloney. “It looks like the Kamra inlay is the one closest to FDA approval, but as a surgeon I’d be very happy to add any one of them to my practice.”
Researchers at University College London (UCL) used a supercomputer to compute 10 billion “transition lines” of the spectral signature of methane, 200 times more comprehensive than previous best efforts. As methane is a biosignature, the development is an advance toward detecting life on planets outside our solar system.
Every molecule absorbs and emits light in a characteristic pattern called the absorption and emission spectrum. In order to determine the atmospheric composition of the exoplanets, astronomers break down the full atmospheric spectrum into known patterns to identify the component molecules.
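The idea of breaking a full spectrum into known molecular patterns can be sketched as a linear least-squares fit. The templates below are made-up numbers, and real pipelines fit many species with full radiative-transfer models; this only demonstrates the principle of recovering component abundances from a combined spectrum.

```python
# Hypothetical line-strength templates sampled at six wavelength bins.
methane = [0.0, 0.8, 0.1, 0.0, 0.4, 0.0]
water   = [0.5, 0.0, 0.0, 0.6, 0.1, 0.0]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def decompose(obs, t1, t2):
    """Least-squares abundances (x, y) so that obs ~ x*t1 + y*t2,
    solving the 2x2 normal equations by hand."""
    a, b, c = dot(t1, t1), dot(t1, t2), dot(t2, t2)
    d, e = dot(obs, t1), dot(obs, t2)
    det = a * c - b * b
    return ((d * c - e * b) / det, (a * e - b * d) / det)

# Observed spectrum built as 2x methane + 3x water (noise-free,
# so the fit recovers the coefficients exactly).
obs = [2 * m + 3 * w for m, w in zip(methane, water)]
print(decompose(obs, methane, water))  # ~ (2.0, 3.0)
```

The completeness of the template is what the UCL work improves: a template missing most hot-band lines makes this kind of fit unreliable for hot atmospheres.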
Detecting methane is important in astrobiology because methane is unstable in an atmosphere, lasting only 300-600 years before solar ultraviolet radiation breaks it down. Since it is unstable, one explanation for its presence on an exoplanet is continual production by carbon-based life. The caveat is that geological processes also replenish methane, so detection is suggestive of, but does not guarantee, life.
Previous methane spectra are incomplete: they contain far fewer transition lines than the new effort, so they do not properly describe methane in high-temperature atmospheres (i.e., hotter than Earth's). At high temperatures there are more transitions because the methane molecule is excited to higher energy states. As a result, the methane levels of hot exoplanets and cool stars have been detected only partially or incorrectly.
An international team of researchers, including University of Hawaii at Manoa astronomer Brent Tully, has mapped the motions of structures of the nearby universe in greater detail than ever before. The maps are presented as a video, which provides a dynamic three-dimensional representation of the universe through the use of rotation, panning, and zooming. The video was announced last week at the conference "Cosmic Flows: Observations and Simulations" in Marseille, France, that honored the career and 70th birthday of Tully.
The Cosmic Flows project has mapped visible and dark matter densities around our Milky Way galaxy up to a distance of 300 million light-years.
The team includes Helene Courtois, associate professor at the University of Lyon, France, and associate researcher at the Institute for Astronomy (IfA), University of Hawaii (UH) at Manoa, USA; Daniel Pomarede, Institute of Research on Fundamental Laws of the Universe, CEA/Saclay, France; Brent Tully, IfA, UH Manoa; and Yehuda Hoffman, Racah Institute of Physics, University of Jerusalem, Israel.
The large-scale structure of the universe is a complex web of clusters, filaments, and voids. Large voids—relatively empty spaces—are bounded by filaments that form superclusters of galaxies, the largest structures in the universe. Our Milky Way galaxy lies in a supercluster of 100,000 galaxies.
Just as the movement of tectonic plates reveals the properties of Earth's interior, the movements of the galaxies reveal information about the main constituents of the Universe: dark energy and dark matter. Dark matter is unseen matter whose presence can be deduced only by its effect on the motions of galaxies and stars because it does not give off or reflect light. Dark energy is the mysterious force that is causing the expansion of the universe to accelerate.
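The inference described here can be illustrated with the standard circular-orbit relation M = v²r/G: from an observed orbital speed and radius, one deduces how much mass must be enclosed, seen or unseen. The numbers below are round illustrative values, not figures from the Cosmic Flows data:

```python
# Illustration of inferring unseen mass from motion: for a circular orbit,
# enclosed mass M = v^2 * r / G. Values are illustrative round numbers
# (~the Sun's speed and radius in its orbit around the Milky Way).

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # one solar mass, kg

v = 220e3              # orbital speed, m/s
r = 2.5e20             # orbital radius, m (~8 kpc)

mass_enclosed = v**2 * r / G
print(f"enclosed mass ~ {mass_enclosed / M_SUN:.1e} solar masses")  # ~1e11
```

When the mass inferred this way exceeds what the visible stars and gas can account for, the difference is attributed to dark matter.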
Whitehead Institute scientists have genetically and enzymatically modified red blood cells to carry a range of valuable payloads—from drugs, to vaccines, to imaging agents—for delivery to specific sites throughout the body.
“We wanted to create high-value red cells that do more than simply carry oxygen,” says Whitehead Founding Member Harvey Lodish, who collaborated with Whitehead Member Hidde Ploegh in this pursuit. “Here we’ve laid out the technology to make mouse and human red blood cells in culture that can express what we want and potentially be used for therapeutic or diagnostic purposes.”
The work, published this week in the Proceedings of the National Academy of Sciences (PNAS), combines Lodish’s expertise in the biology of red blood cells (RBCs) with biochemical methods developed in Ploegh’s lab.
RBCs are an attractive vehicle for potential therapeutic applications for a variety of reasons, including their abundance—they are more numerous than any other cell type in the body—and their long lifespan (up to 120 days in circulation). Perhaps most importantly, during RBC production, the progenitor cells that eventually mature to become RBCs jettison their nuclei and all DNA therein. Without a nucleus, a mature RBC lacks any genetic material or any signs of earlier genetic manipulation that could result in tumor formation or other adverse effects.
Exploiting this characteristic, Lodish and his lab introduced genes encoding slightly modified versions of normal red-cell surface proteins into early-stage RBC progenitors. As the RBCs approach maturity and enucleate, the proteins remain on the cell surface, where they are modified by Ploegh’s protein-labeling technique. Referred to as “sortagging,” the approach relies on the bacterial enzyme sortase A to establish a strong chemical bond between the surface protein and a substance of choice, be it a small-molecule therapeutic or an antibody capable of binding a toxin. The modifications leave the cells and their surfaces unharmed.
“Because the modified human red blood cells can circulate in the body for up to four months, one could envision a scenario in which the cells are used to introduce antibodies that neutralize a toxin,” says Ploegh. “The result would be long-lasting reserves of antitoxin antibodies.”
Inventor Dean Kamen is planning a 2.5 kW home version of his Deka Research Beacon 10 Stirling engine that could provide efficient around-the-clock power or hot water to a home or business, reports Forbes. Kamen says the current Beacon is intended for businesses like laundries or restaurants that use a lot of hot water. “With commercialization partner NRG Energy, he’s deployed roughly 20 of the machines and expects to put them into production within 18 months,” says Forbes.
But Kamen has bigger plans: feeding excess power to the grid by networking devices across a region together. Depending on the price of natural gas, “ten years from today the probability that you are depending on wires hanging on tree branches is as likely as that you’ll still be installing land lines for telephones,” he says. “Close to zero.”
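As a back-of-envelope check on the around-the-clock claim, a 2.5 kW unit running continuously would deliver:

```python
# Continuous output of the home-scale unit described above.
power_kw = 2.5
daily_kwh = power_kw * 24        # energy per day
monthly_kwh = daily_kwh * 30     # energy per 30-day month

print(daily_kwh, monthly_kwh)    # 60.0 kWh/day, 1800.0 kWh/month
```

That monthly figure is well above typical household consumption, which is why Kamen envisions feeding the surplus back to the grid.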
IBM has announced that it expects to have commercialised its carbon nanotube transistor technology in the early 2020s, thanks to a new design that would allow the transistors to be built on silicon wafers using similar techniques to existing chip manufacturing plants.
The semiconductor industry has been working hard for the last few decades on following Moore's Law, the observation by Intel co-founder Gordon Moore that the number of transistors on a chip tends to double roughly every eighteen months. In recent years, following that trend has become increasingly complex: the ever-shrinking size of the components and the distance between them makes manufacturing difficult, while interference between components must be corrected and designed out.
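Taken at face value, that doubling rate compounds quickly; a minimal sketch:

```python
# Moore's Law as stated above: transistor counts double roughly every
# 18 months, i.e. scale by 2^(months / 18).

def transistors_after(initial, months, doubling_period=18):
    """Projected transistor count after `months` of steady doubling."""
    return initial * 2 ** (months / doubling_period)

# Example: a 1-billion-transistor chip after a decade of such doubling
# (120/18 ~ 6.7 doublings, roughly a 100x increase).
print(f"{transistors_after(1e9, 120):.2e}")
```

The difficulty the article describes is that sustaining this curve now requires shrinking components to scales where manufacturing and interference become serious obstacles.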
One possible solution is a move away from traditional semiconductor designs, and numerous companies are working on exactly that. Back in 2012, IBM announced the creation of a 9nm carbon nanotube transistor, dropping below the 10nm barrier for the first time. In September last year, the company further announced that it had used the transistors to build a fully working computer for the first time, but it remained silent as to when the technology would be likely to leave the lab and reach shop shelves.
Speaking to MIT's Technology Review, IBM researchers have finally given themselves a deadline: to have commercialised carbon nanotube transistor semiconductors by the early 2020s. The secret is a shift in design, featuring six nanotubes measuring 1.4nm in width lined up in parallel, to build the transistors. This design, the company has claimed, could potentially be manufactured using current semiconductor fabrication plants with little modification - the route-to-market the technology desperately needed.
Soil moisture, the water contained within soil particles, is an important player in Earth's water cycle. It is essential for plant life and influences weather and climate. Satellite readings of soil moisture will help scientists better understand the climate system and have potential for a wide range of applications, from advancing climate models, weather forecasts, drought monitoring and flood prediction to informing water management decisions and aiding in predictions of agricultural productivity.
Launched June 10, 2011, aboard the Argentinian spacecraft Aquarius/Satélite de Aplicaciones Científicas (SAC)-D, Aquarius was built to study the salt content of ocean surface waters. The new soil wetness measurements were not in the mission's primary science objectives, but a NASA-funded team led by U.S. Department of Agriculture (USDA) researchers has developed a method to retrieve soil moisture data from the instrument's microwave radiometer.
The Aquarius measurements are considerably coarser in spatial resolution than the measurements from the upcoming NASA Soil Moisture Active Passive (SMAP) mission, which was specifically designed to provide the highest-quality soil moisture measurements available, including a spatial resolution 10 times finer than that offered by Aquarius.
Soils naturally radiate microwaves, and the Aquarius sensor can detect the microwave signal from the top 2 inches (5 centimeters) of the land, a signal that varies subtly with changes in the wetness of the soil. Aquarius takes eight days to complete each worldwide survey of soil moisture, albeit with gaps in mountainous or vegetated terrain where the microwave signal becomes difficult to interpret.
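The retrieval idea can be illustrated with a deliberately simplified toy model. This is an assumption made for illustration, not the USDA team's algorithm: wetter soil emits microwaves less efficiently, so its measured brightness temperature drops, and the relationship can be inverted.

```python
# Toy model (illustrative assumption, not the actual USDA retrieval):
# emissivity falls linearly with volumetric soil moisture, and brightness
# temperature TB = emissivity * physical soil temperature.

def brightness_temp(soil_temp_k, moisture, e_dry=0.95, slope=0.6):
    """Forward model: TB for a given soil temperature and moisture."""
    emissivity = e_dry - slope * moisture   # moisture as a 0-0.5 fraction
    return emissivity * soil_temp_k

def retrieve_moisture(tb, soil_temp_k, e_dry=0.95, slope=0.6):
    """Invert the toy model: recover moisture from an observed TB."""
    return (e_dry - tb / soil_temp_k) / slope

tb = brightness_temp(290.0, 0.25)               # simulate a measurement
print(round(retrieve_moisture(tb, 290.0), 4))   # 0.25
```

The real retrieval must also account for vegetation, surface roughness, and temperature, which is why mountainous and heavily vegetated terrain leaves gaps in the maps.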
The first private mission to the Red Planet aims to carry selfies with it.
The first human visitors to Mars may have a warm reminder of home to greet them when they land. With the help of major partners and the nonprofit Explore Mars, a team of college students is now raising money to launch the first private mission to the Red Planet. It won't carry people; it will carry messages.
For a 99-cent contribution, anyone can submit images, messages, audio clips, and videos for inclusion in the time capsule. The team hopes to send the $25 million project to Mars in five years, far ahead of any manned mission.
The idea germinated at a TGI Fridays. Emily Briere, a Duke University engineering major, had just attended the Humans to Mars Summit in Washington, D.C., and was chatting with her family and her family friend--inventor and space enthusiast Eric Knight--about how to breathe some life and excitement into some of the advanced technologies they had just learned about.
The group decided that a time capsule project--the challenge of creating a small craft that could land and survive on Mars and preserve data from Earth--was the perfect mission: It would educate and excite a large number of people about space travel and unite people globally around a common goal. Importantly for raising the required money, it would also be useful for testing the technologies used to get the capsule to Mars.
“Our goal was to create almost an Apollo-era level of excitement,” says Briere, who is now a rising junior at Duke and serves as mission director for the Time Capsule to Mars project.
Partnered with MIT’s Space Propulsion Lab, the group will test the newest space engine technology--ion electrospray propulsion--which could reduce travel time to Mars to as little as four months. It will also test out other designs, including “delay tolerant networking" systems that will help the capsule transmit data from deep space and optical quartz storage technology that will encode terabytes of time capsule data for millions of years. According to Briere, the biggest current technological challenge is maintaining communications with the capsule from 140 million miles away--they plan to try out inflatable antennas.
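The scale of that communications challenge is easy to quantify: even at light speed, a signal from 140 million miles away takes several minutes to arrive each way.

```python
# One-way and round-trip signal delay at the Earth-Mars distance quoted
# in the article (140 million miles), at the speed of light.

SPEED_OF_LIGHT_MI_S = 186_282   # miles per second

def one_way_delay_minutes(distance_miles):
    return distance_miles / SPEED_OF_LIGHT_MI_S / 60

delay = one_way_delay_minutes(140e6)
print(f"one-way: {delay:.1f} min, round trip: {2 * delay:.1f} min")  # ~12.5 and ~25 min
```

No real-time control is possible at such delays, which is why robust "delay tolerant networking" and high-gain antennas matter so much for the mission.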
The project has already gained the support and advising of partners including Lockheed Martin, NASA, Stanford, Duke, UConn, and MIT, among other organizations, as well as two former NASA chief astronauts. Students at a growing network of universities are collaborating on the design, technical, business, and marketing plans (Briere's three siblings are also involved). The group is also focusing on an educational mission “to close the gap that currently exists between student interest and the opportunities available to advance space exploration.” Individuals will be able to take part in the mission through virtual Mission Control portals and get involved in scientific experiments and data that will be collected on board.
A checklist for the requirements of life as scientists define it could help ground speculation about the possibilities of alien life on distant worlds, new research suggests. Astronomers have confirmed the existence of more than 1,700 planets beyond the solar system, and may soon prove the existence of thousands more of such exoplanets.
"As we find more and more exoplanets, we are certainly going to discover worlds that resemble Earth to some degree," said study author Chris McKay, an astrobiologist at NASA Ames Research Center in Moffett Field, California. "This raises the question of whether or not such exoplanets could support life and what kind of life might live there."
To understand whether life might exist on alien worlds, McKay suggested scientists should evaluate both the requirements for life on Earth and the limits of life on Earth. Although scientific understanding of the requirements for life has not changed in many years, researchers' thoughts on the limits of life have changed significantly in the past few decades, McKay said.
There are four general categories of the requirements for life on Earth: energy, carbon, liquid water and miscellaneous factors, McKay said.
The energy for life on Earth all comes from the shuffling of electrons from molecule to molecule by chemical reactions, which is driven by light-absorbing proteins in the case of photosynthesis. Carbon is the backbone of life on Earth because it can support an extraordinary variety of molecules for use in biology. Liquid water serves as the solvent in which the chemical reactions of life on Earth take place. Other factors required by life on Earth include elements such as nitrogen, which is used to make proteins and DNA, among many other molecules.
McKay noted that life could dominate exoplanets, and hence be detectable over interstellar distances. But that would only happen if that life is powered by light, he said. Still, life may not need much light in all cases; algae on Earth that live in the deep sea or under ice can survive on sunlight at levels less than one-100,000th of what Earth receives, far below even the dim sunlight that reaches Pluto.
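The inverse-square law puts these light levels in context; a quick check, using Pluto's mean distance of about 39.5 AU:

```python
# Sunlight flux falls off as 1/d^2 with distance from the Sun (d in AU,
# with Earth's flux = 1). Pluto's mean distance is roughly 39.5 AU.

pluto_distance_au = 39.5
pluto_flux = 1 / pluto_distance_au**2        # relative to Earth's sunlight
algae_threshold = 1 / 100_000                # survival level quoted above

print(f"Pluto flux: {pluto_flux:.2e} of Earth's")                 # ~6.4e-04
print(f"margin: {pluto_flux / algae_threshold:.0f}x threshold")   # ~64x
```

So Pluto still receives several dozen times more sunlight than the dimmest levels at which some Earth algae are known to survive.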
Light can also provide energy for life in ways beyond photosynthesis. For instance, on worlds like Saturn's moon Titan, sunlight generates molecules such as acetylene and hydrogen gas in the atmosphere that could be used for energy in alien biology.
"Microorganisms on the surface of Titan would have these food sources just coming down from the sky, no need to bother with photosynthesis," McKay explains.
To investigate the limits of life on Earth, researchers look at extremophiles, organisms that have adapted to live in environmental extremes, such as extremes of heat, cold and radiation. The highest temperature at which scientists know life can live has increased significantly, from 176 degrees F (80 degrees C) to a whopping 251 degrees F (122 degrees C), or well above boiling. Recently, investigators also discovered microbes can live in temperatures as cold as 5 degrees F (minus 15 degrees C), or well below freezing.
McKay suggested that many potential limits to life, such as acidity, saltiness or ultraviolet radiation, are unlikely to be extreme enough to stifle life. He said the most important parameter for Earth-like life may be the presence of liquid water, but studies of life in extreme deserts show that even a small amount of rain, fog, snow and even simple humidity can help sustain life. Moreover, alien life may not even need liquid water; the liquid hydrocarbons on Titan, for example, might serve as the basis for life, playing the same role water does for life on Earth.
Newly published research reinterprets cold dark matter, opening up the possibility that it could be regarded as a very cold quantum fluid governing the formation of the structure of the Universe.
Tom Broadhurst, an Ikerbasque researcher at the UPV/EHU’s Department of Theoretical Physics, has participated alongside scientists of the National Taiwan University in a piece of research that explores cold dark matter in depth and proposes new answers about the formation of galaxies and the structure of the Universe. These predictions, published in the prestigious journal Nature Physics, are being contrasted with fresh data provided by the Hubble space telescope.
In cosmology, cold dark matter is a form of matter whose particles move slowly in comparison with light and interact weakly with electromagnetic radiation. It is estimated that only a minute fraction of the matter in the Universe is baryonic matter, which forms stars, planets, and living organisms. The rest, comprising over 80% of all matter, is dark matter.
The theory of cold dark matter helps to explain how the Universe evolved from its initial state to the current distribution of galaxies and clusters, the structure of the Universe on a large scale. The theory has, however, been unable to satisfactorily explain certain observations; the new research by Broadhurst and his colleagues sheds light on this.
As the Ikerbasque researcher explained, “guided by the initial simulations of the formation of galaxies in this context, we have reinterpreted cold dark matter as a Bose-Einstein condensate”. So, “the ultra-light bosons forming the condensate share the same quantum wave function, so disturbance patterns are formed on astronomic scales in the form of large-scale waves”.
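A rough sense of the scale of these waves comes from the de Broglie wavelength λ = h/(mv). The boson mass (~1e-22 eV) and velocity (~100 km/s) below are illustrative values in line with ultra-light dark matter proposals, not figures taken from the paper itself:

```python
# De Broglie wavelength of an ultra-light boson: lambda = h / (m * v).
# Mass and velocity are illustrative assumptions, not values from the paper.

H_PLANCK = 6.626e-34      # Planck constant, J*s
EV_TO_KG = 1.783e-36      # 1 eV/c^2 expressed in kg
KPC_M = 3.086e19          # metres per kiloparsec

m = 1e-22 * EV_TO_KG      # assumed boson mass, kg
v = 1e5                   # ~100 km/s, typical galactic velocity

wavelength_kpc = H_PLANCK / (m * v) / KPC_M
print(f"de Broglie wavelength ~ {wavelength_kpc:.1f} kpc")  # ~1.2 kpc
```

A kiloparsec-scale wavelength is comparable to the cores of dwarf galaxies, which is why interference patterns of such a condensate could show up on astronomical scales.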
The theory predicts that galaxies formed in this context should have large standing waves of dark matter, called solitons, at their centers, which would explain the puzzling cores observed in common dwarf galaxies.
The model also predicts that galaxies form relatively late in this scenario compared with the standard-particle interpretation of cold dark matter. The team is comparing these new predictions with observations by the Hubble Space Telescope.
The results are promising, as they open up the possibility that dark matter could be regarded as a very cold quantum fluid that governs the formation of structure across the whole Universe. The research also opens fresh possibilities for studying the first galaxies to emerge after the Big Bang.
Laser pulses have been made to accelerate themselves around loops of optical fibre, which seems to go against Newton's third law: for every action there is an equal and opposite reaction. The new research exploits a loophole with light that makes it appear to have mass.
Under Newton’s third law of motion, if we imagine one billiard ball striking another upon a pool table, the two balls will bounce away from each other. If one of the billiard balls had a negative mass, then the collision of the two balls would result in them accelerating in the same direction. This effect could be used in a diametric drive, where negative and positive mass interact for a continuously propulsive effect. Such a drive also relies on the assumption that negative mass has negative inertia.
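The thought experiment can be made concrete in a few lines of code: Newton's second law a = F/m means a negative mass accelerates opposite to the force on it, so two mutually interacting bodies end up accelerating in the same direction.

```python
# The billiard-ball thought experiment: forces are equal and opposite
# (Newton's third law), but accelerations follow a = F/m, so a negative
# mass accelerates against the force applied to it.

def accelerations(m1, m2, force_on_1):
    """Accelerations of two bodies under equal-and-opposite forces."""
    a1 = force_on_1 / m1
    a2 = -force_on_1 / m2      # reaction force on body 2
    return a1, a2

# Normal masses: opposite accelerations.
print(accelerations(1.0, 1.0, 5.0))    # (5.0, -5.0)

# One negative mass: both accelerate the same way -- the "diametric drive".
print(accelerations(1.0, -1.0, 5.0))   # (5.0, 5.0)
```

The pair chases itself indefinitely, which is what makes the diametric drive continuously propulsive in principle.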
Quantum mechanics however states that matter cannot have a negative mass. Negative mass is not the same as antimatter, as even antimatter has positive mass. Negative mass is a hypothetical concept of matter where mass is of opposite sign to the mass of normal matter. Negative mass is used in speculative theories, such as the construction of wormholes. Should such matter exist, it would violate one or more energy conditions and show strange properties. No material object has ever been found that can be shown by experiment to have a negative mass.
Experimental physicist Ulf Peschel and his colleagues at the University of Erlangen-Nuremberg in Germany have now made a diametric drive using effective mass. Photons travelling at the speed of light have no rest mass, but when pulses of light shine into layered materials such as crystals, some of the photons are reflected backwards by one layer and forwards by another. This delays part of the pulse, which then interferes with the rest of the pulse as it passes more slowly through the material.
When a material such as a layered crystal slows a light pulse in proportion to its energy, the pulse behaves as if it has mass. This "effective mass" is the mass a particle appears to have when responding to forces. Light pulses can have a negative effective mass, depending on the shape of their light waves and the structure of the crystal they pass through. The obstacle is that getting a negative-mass pulse to interact with a positive-effective-mass pulse would require a crystal so long that it would absorb the light before the pulses could show a diametric drive effect.
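The sign of the effective mass can be sketched numerically: for a dispersion relation E(k), the effective mass is m* = ħ²/(d²E/dk²), so the curvature of E(k) sets the sign of the mass. A minimal illustration with toy dispersion functions and ħ set to 1:

```python
# Effective mass from a dispersion relation E(k): m* = hbar^2 / E''(k).
# The second derivative is estimated numerically; dispersions are toys.

def effective_mass(energy, k, dk=1e-4, hbar=1.0):
    """Numerical curvature of E(k) -> effective mass at wavenumber k."""
    curvature = (energy(k + dk) - 2 * energy(k) + energy(k - dk)) / dk**2
    return hbar**2 / curvature

# Upward curvature (bottom of a band): positive effective mass.
print(effective_mass(lambda k: k**2, 0.5) > 0)       # True

# Downward curvature (top of a band): negative effective mass.
print(effective_mass(lambda k: -(k**2), 0.5) > 0)    # False
```

Shaping a pulse so that it sits on a region of downward curvature is what gives it the negative effective mass the experiment relies on.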
To get around this requirement, Peschel created a series of laser pulses in two loops of fibre-optic cable. The pulses were split between the loops at a contact point, and the light kept moving around each loop in the same direction.
After several years of anticipation, the FDA has finally proposed a pair of guidelines for how drug and device makers should cope with some of the challenges and pitfalls posed by social media.
One of the so-called draft guidances offers instructions on how companies should attempt to correct product information on websites that are run by others, such as chat rooms. The other addresses how products – including risk and benefit information – can be discussed in venues such as Twitter, as well as paid search links on Google and Yahoo, all of which have limited space. This will involve using clickable links to product websites, for instance.
“These are intended to have a beneficial impact on public health,” Tom Abrams, who heads the FDA Office of Prescription Drug Promotion, tells us. “But these were not developed in a vacuum. They were developed with careful consideration and with input from industry and many other stakeholders. There was a lot of important consideration given to the issues.”
For third-party websites, such as Wikipedia, the draft guidance suggests that companies should feel free to correct misinformation, but that any correction must include balanced information and the source of the revision or update must be noted, Abrams explains. This means a company or company employee or contractor should be credited with any additions.
“The information should not be promotional and should be factually correct. This is not an opportunity for a company to tout its drugs,” he says. “The information [being added or revised] should be consistent with the FDA-approved [product] labeling and for it to be effective, you want it posted right by the misinformation.”
The guidance also says that companies should contact writers, such as bloggers, to make changes when they learn of misinformation. Abrams notes companies will not be held responsible for those who do not make changes. If none of this is possible, he says companies should contact web site operators and suggest they delete the misinformation or open the site to comments so that corrections can be made.
The guidelines are being released nearly five years after the FDA held a well-attended public hearing to sift through Internet issues confronting drug and device makers. But the guidelines never materialized, despite repeated signals that they might be forthcoming. Now, FDA officials must act before a July deadline set by a 2012 law requiring them to release guidance on product promotion on the Internet.