Amazing Science
Scooped by Dr. Stefan Gruenwald!

Wireless brain sensor-transmitter could unchain neuroscience from cables

In a study in the journal Neuron, scientists describe a new high-data-rate, low-power wireless brain sensor. The technology is designed to enable neuroscience research that cannot be accomplished with current sensors, which tether subjects with cabled connections. Experiments in the paper confirm this new capability: the technology transmitted rich, neuroscientifically meaningful signals from animal models as they slept and woke or exercised.

“We view this as a platform device for tapping into the richness of electrical signals from the brain among animal models where their neural circuit activity reflects entirely volitional and naturalistic behavior, not constrained to particular space,” said Arto Nurmikko, professor of engineering and physics affiliated with the Brown Institute for Brain Science and the paper’s senior and corresponding author. “This enables new types of neuroscience experiments with vast amounts of brain data wirelessly and continuously streamed from brain microcircuits.”

“The brain sensor is opening unprecedented opportunities for the development of neuroprosthetic treatments in natural and unconstrained environments,” said study co-author Grégoire Courtine, a professor at EPFL (École polytechnique fédérale de Lausanne), who collaborated with Nurmikko’s group on the research. To confirm the system’s performance, the researchers ran a series of experiments with rhesus macaques, which walked on a treadmill while the wireless system measured neural signals associated with the brain’s motion commands. In another experiment, animal subjects went through sleep/wake cycles unencumbered by cables or wires; the data showed distinct patterns related to the different stages of consciousness and the transitions between them.

“We hope that the wireless neurosensor will change the canonical paradigm of neuroscience research, enabling scientists to explore the nervous system within its natural context and without the use of tethering cables,” said co-lead author David Borton. “Subjects are free to roam, forage, sleep, etc., all while the researchers are observing the brain activity. We are very excited to see how the neuroscience community leverages this platform.”


Celestial impacts, mass extinctions and climate change in the search for life elsewhere

Every so often our Earth encounters a large chunk of space debris, a reminder that our solar system still contains plenty of material that could potentially have an impact on life on Earth.

While the great bulk of planetary accretion occurs in the first few hundred million years after the birth of a given system, the process never really comes to an end. Most of the objects that make up the tail of this accretion – grains of dust, lumps of ice, and pieces of rock – smash into our atmosphere and ablate harmlessly many kilometres above the ground, visible only as shooting stars. Larger impacts do, however, continue to occur – as illustrated on February 15, 2013, in the Russian city of Chelyabinsk. On that day, with no warning, a small near-Earth asteroid detonated in the atmosphere, and outshone the noon-day sun.

Though the object itself was relatively small, around 20m in diameter, it exploded with sufficient force to shatter windows many kilometres away, damaging more than 7,000 buildings. Amazingly, nobody was killed – but the impact served as a stark reminder of the dangers posed by rocks from space. The longer the timescale we consider, the larger the biggest collision the Earth might experience. A stand-out example is the impact, around 65 million years ago, thought to have contributed to the extinction of the dinosaurs.

Fortunately for us here on Earth, the rate at which such catastrophic impacts occur is relatively low, but this might not be the case in other planetary systems. Thanks to observations carried out at infrared wavelengths, we are now in a position to start categorising the small-object populations of other planetary systems. As we do, we are finding that many systems contain far more debris, left over from their formation, than our own does. This gives us an additional tool for assessing potentially habitable planets: based on these kinds of observations, it should be possible to estimate the impact regimes they might experience.


Fibonacci quasiparticles could form basis of future topological quantum computers (TQC)

Topological quantum computing (TQC) is a newer type of quantum computing that uses "braids" of particle tracks, rather than actual particles such as ions and electrons, as the qubits to implement computations. Using braids has one important advantage: it makes TQCs practically immune to the small perturbations in the environment that cause decoherence in particle-based qubits and often lead to high error rates.

Ever since TQC was first proposed in 1997, experimentally realizing the appropriate braids has been extremely difficult. For one thing, the braids are formed not by the trajectories of ordinary particles, but by the trajectories of exotic quasiparticles (particle-like excitations) called anyons. In addition, the movements of the anyons must be non-Abelian: a non-commutative property whereby changing the order of the anyons' movements changes their final tracks. In most proposals of TQC so far, the non-Abelian statistics of the anyons has not been powerful enough, even in theory, for universal TQC.

Now in a new study published in Physical Review Letters, physicists Abolhassan Vaezi at Cornell University and Maissam Barkeshli at Microsoft's research lab Station Q have theoretically shown that anyons tunneling in a double-layer system can transition to an exotic non-Abelian state that contains "Fibonacci" anyons that are powerful enough for universal TQC.
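For context (this is not from the article, but is standard in the topological quantum computing literature): the "Fibonacci" label comes from the anyons' fusion rule. A single nontrivial anyon type, usually written $\tau$, satisfies

```latex
% Fusion rule for the Fibonacci anyon \tau:
% two \tau anyons can fuse either to the vacuum (1) or to another \tau.
\tau \times \tau = 1 + \tau
% As a consequence, the number of possible fusion outcomes of n such anyons
% grows as the Fibonacci sequence, and the quantum dimension of \tau is the
% golden ratio, the positive solution of d^2 = 1 + d:
d_\tau = \frac{1 + \sqrt{5}}{2}, \qquad d_\tau^2 = 1 + d_\tau
```

It is this exponentially growing fusion space, explored purely by braiding, that makes Fibonacci anyons sufficient for universal quantum computation.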

"Our work suggests that some existing experimental setups are rich enough to yield a phase capable of performing 'universal' TQC, i.e., all of the required logical gates for a quantum computer can be made through the braiding of anyons alone," Vaezi said. "Since braiding is a topological operation and does not perturb the low-energy physics, the resulting quantum computer is fault-tolerant."

Risto Suoknuuti's curator insight, March 29, 2015 6:25 PM

I like this idea of TQC. Changing tracks/states mimics computing. I think the idea of a connectome is fruitful here: a connectome with different types of passages, that is, a hypergraphic version of this theory.


Reconstructing 3D neural tissue with biocompatible nanofiber scaffolds and hydrogels

Damage to neural tissue is typically permanent and causes lasting disability in patients, but a newly discovered approach holds incredible potential to reconstruct neural tissue at high resolution in three dimensions. Research recently published in the Journal of Neural Engineering demonstrated a method for embedding scaffolds of patterned nanofibers within three-dimensional (3D) hydrogel structures. Neurite outgrowth from neurons in the hydrogel followed the scaffolding, tracking directly along the nanofibers, particularly when the fibers were coated with a cell adhesion molecule called laminin. The coated nanofibers also significantly enhanced the length of growing neurites, and the type of hydrogel significantly affected the extent to which the neurites tracked the nanofibers.

“Neural stem cells hold incredible potential for restoring damaged cells in the nervous system, and 3D reconstruction of neural tissue is essential for replicating the complex anatomical structure and function of the brain and spinal cord,” said Dr. McMurtrey, author of the study and director of the research institute that led this work. “So it was thought that the combination of induced neuronal cells with micropatterned biomaterials might enable unique advantages in 3D cultures, and this research showed that not only can neuronal cells be cultured in 3D conformations, but the direction and pattern of neurite outgrowth can be guided and controlled using relatively simple combinations of structural cues and biochemical signaling factors.”


NASA: Measuring the Dramatic Elevation Changes on the Greenland Ice Sheet

Since the late 1970s, NASA has been monitoring changes in the Greenland Ice Sheet. Recent analysis of seven years of surface elevation readings from NASA's ICESat satellite and four years of laser and ice-penetrating radar data from NASA's airborne mission Operation IceBridge shows how the surface elevation of the ice sheet has changed.

The colors shown on the surface of the ice sheet represent the accumulated change in elevation since 2003. The light yellow over the central region of the ice sheet indicates a slight thickening due to snow. This accumulation, along with the weight of the ice sheet, pushes ice toward the coast. Thinning near coastal regions, shown in green, blue and purple, has increased over time and now extends into the interior of the ice sheet where the bedrock topography permits. As a result, there has been an average loss of 300 cubic kilometers of ice per year between 2003 and 2012.
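As a rough cross-check of the quoted figure (the conversion factors below are standard values, not taken from NASA's animation), 300 cubic kilometers of ice per year corresponds to a bit under a millimeter of global sea-level rise annually:

```python
# Convert the quoted annual ice-volume loss into a sea-level contribution.
# Assumed standard constants (illustrative, not from the article):
ICE_DENSITY_GT_PER_KM3 = 0.917   # 1 km^3 of glacier ice is ~0.917 gigatons
GT_PER_MM_SEA_LEVEL = 362.0      # ~362 Gt of meltwater raises sea level ~1 mm

ice_loss_km3_per_year = 300                                # figure in the text
mass_gt = ice_loss_km3_per_year * ICE_DENSITY_GT_PER_KM3   # ~275 Gt per year
sea_level_mm_per_year = mass_gt / GT_PER_MM_SEA_LEVEL      # ~0.76 mm per year
```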

This animation portrays the changes occurring in the surface elevation of the ice sheet since 2003 in three drainage regions: the southeast, the northeast and the Jakobshavn regions. In each region, the time advances to show the accumulated change in elevation from 2003 through 2012.


Colliding Galaxies Are Producing A Spectacular Light Show

Some 130 million light years away, within the constellation Canis Major, two twinkling spiral galaxies are in the process of colliding, treating our eyes to a dazzling show. The pair—NGC 2207 and IC 2163—has hosted the explosive deaths of three stars as supernovas in the last 15 years, and is also spawning stars at an intense rate. But scientists have become particularly interested in these merging galaxies for another reason: They are home to one of the largest known collections of super bright X-ray objects. These so-called “ultraluminous X-ray sources” have been spotted using NASA’s Chandra X-ray Observatory, which is partly responsible for the lustrous composite image shown above.

Just like the Milky Way, these galaxies are teeming with bright sources of X-rays called X-ray binaries. These are systems in which a normal star closely orbits a collapsed star, such as a neutron star or black hole. Strong gravitational forces from the compact star draw material off the normal star, a process known as accretion. As the material falls toward the compact star, it is heated to millions of degrees and generates a huge amount of X-rays. While intense, the emission from these systems pales in comparison to that from ultraluminous X-ray sources (ULXs).

As the name suggests, ULXs are exceedingly bright; they emit more radiation in X-rays alone than a million suns do at all wavelengths. Although they are not well understood, many believe they could be black holes of approximately 10 solar masses that are collecting, or accreting, material onto a disk and emitting X-rays in an intense beam. ULXs are also extremely rare: our galaxy doesn't have one, and most galaxies don't, but those that do usually have only one. Between NGC 2207 and IC 2163, however, there are 28.
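A back-of-the-envelope calculation shows why beamed emission is invoked (constants below are standard physical values, not from the article): the Eddington luminosity, the maximum steady accretion luminosity for a given mass, is only about a third of "a million suns" for a 10-solar-mass black hole.

```python
import math

# Standard constants in SI units (illustrative values, not from the article).
G       = 6.674e-11    # gravitational constant
c       = 2.998e8      # speed of light, m/s
m_p     = 1.673e-27    # proton mass, kg
sigma_T = 6.652e-29    # Thomson scattering cross-section, m^2
M_SUN   = 1.989e30     # solar mass, kg
L_SUN   = 3.828e26     # solar luminosity, W

def eddington_luminosity(mass_kg):
    """Luminosity at which radiation pressure on infalling ionized
    hydrogen balances gravity."""
    return 4 * math.pi * G * mass_kg * m_p * c / sigma_T

L_edd_10msun = eddington_luminosity(10 * M_SUN)   # ~1.3e32 W
ulx_floor    = 1e6 * L_SUN                        # "a million suns": ~3.8e32 W
# ulx_floor exceeds L_edd_10msun, so either the mass is larger or the
# emission is beamed toward us, as the article suggests.
```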


Plastic Pollution: More than 5 Trillion Plastic Pieces Weighing over 250,000 Tons Afloat at Sea

Nearly 269,000 tons of plastic pollution may be floating in the world's oceans, according to a new study. Microplastic pollution is found in varying concentrations throughout the oceans, but estimates of the global abundance and weight of floating plastics, both micro and macroplastic, lack sufficient data to support them. To better estimate the total number of plastic particles and their weight floating in the world's oceans, scientists from six countries contributed data from 24 expeditions collected over a six-year period from 2007-2013 across all five sub-tropical gyres, coastal Australia, Bay of Bengal, and the Mediterranean Sea.

The data included information about microplastics collected using nets and large plastic debris from visual surveys, which were then used to calibrate an ocean model of plastic distribution.

Based on the data and model, the authors of the study estimate a minimum of 5.25 trillion plastic particles, weighing nearly 269,000 tons, afloat in the world's oceans. Large plastics appear to be abundant near coastlines and degrade into microplastics in the five subtropical gyres; unexpectedly, the smallest microplastics were present in more remote regions, such as the subpolar gyres. This distribution may suggest that the gyres act as 'shredders', breaking large plastic items into microplastics that are then ejected across the ocean.
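A quick sanity check on the two headline numbers (the arithmetic is exact; the interpretation is a plausible reading, not a claim from the study): dividing total mass by particle count gives roughly 50 milligrams per particle on average, consistent with most of the mass sitting in a small number of large items while most of the count is far lighter microplastic.

```python
particles = 5.25e12     # minimum estimated particle count
mass_tons = 269_000     # estimated total mass, metric tons

MG_PER_TON = 1e9        # 1 metric ton = 10^9 milligrams
avg_mass_mg = mass_tons * MG_PER_TON / particles   # ~51 mg per particle
```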

"Our findings show that the garbage patches in the middle of the five subtropical gyres are not the final resting places for the world's floating plastic trash. The endgame for micro-plastic is interactions with entire ocean ecosystems," says Marcus Eriksen, PhD, Director of Research for the 5 Gyres Institute.


A Worm's Mind In A Lego Body: Scientists Map the Brain Connectome of C. elegans and Upload It to a Lego Robot

Take the connectome of a worm and transplant it as software in a Lego Mindstorms EV3 robot: what happens next? It is a deep and long-standing philosophical question: are we just the sum of our neural networks? Of course, if you work in AI you take the answer mostly for granted, but until someone builds a human brain and switches it on, we really don't have a concrete example of the principle in action.

The nematode worm Caenorhabditis elegans (C. elegans) is tiny and only has 302 neurons. These have been completely mapped and the OpenWorm project is working to build a complete simulation of the worm in software. One of the founders of the OpenWorm project, Timothy Busbice, has taken the connectome and implemented an object oriented neuron program.

The model is accurate in its connections and uses UDP packets to fire neurons. If two neurons have three synaptic connections, then when the first neuron fires, a UDP packet is sent to the second neuron with the payload "3". The neurons are addressed by IP and port number. The system uses an integrate-and-fire algorithm: each neuron sums the incoming weights and fires if the sum exceeds a threshold. The accumulator is zeroed if no message arrives within a 200 ms window or when the neuron fires. This is similar to what happens in the real neural network, but not exact.

The software works with sensors and effectors provided by a simple LEGO robot. The sensors are sampled every 100ms. For example, the sonar sensor on the robot is wired as the worm's nose. If anything comes within 20cm of the "nose" then UDP packets are sent to the sensory neurons in the network.

The same idea is applied to the 95 motor neurons but these are mapped from the two rows of muscles on the left and right to the left and right motors on the robot. The motor signals are accumulated and applied to control the speed of each motor.  The motor neurons can be excitatory or inhibitory and positive and negative weights are used. 
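The integrate-and-fire rule described above can be sketched as follows. This is a hypothetical minimal reimplementation, not the OpenWorm code: the threshold value is illustrative, and in the real system each "message" arrives as a UDP packet addressed by IP and port.

```python
class Neuron:
    """Integrate-and-fire accumulator with a 200 ms quiet-window reset,
    as described in the article (illustrative sketch)."""

    def __init__(self, threshold, window_ms=200):
        self.threshold = threshold
        self.window_ms = window_ms
        self.acc = 0
        self.last_msg_ms = None

    def receive(self, weight, now_ms):
        """Accumulate an incoming synaptic weight; return True on firing."""
        # Zero the accumulator if no message arrived within the window.
        if self.last_msg_ms is not None and now_ms - self.last_msg_ms > self.window_ms:
            self.acc = 0
        self.last_msg_ms = now_ms
        self.acc += weight   # inhibitory inputs would carry negative weights
        if self.acc >= self.threshold:
            self.acc = 0     # reset on firing; downstream UDP packets go out here
            return True
        return False
```

A synapse of strength 3 maps to `receive(3, now_ms)`: two such inputs 50 ms apart cross a threshold of 5 and fire, while the same two inputs 500 ms apart do not, because the quiet window resets the accumulator between them.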

And the result? It is claimed that the robot behaved in ways that are similar to observed C. elegans. Stimulation of the nose stopped forward motion. Touching the anterior and posterior touch sensors made the robot move forward and back accordingly. Stimulating the food sensor made the robot move forward.

More Information: The Robotic Worm (Biocoder pdf - free on registration)

New high precision method to measure the distance of galaxies

All big galaxies in the Universe host a supermassive black hole in their center and in about a tenth of all galaxies, these supermassive black holes are growing by swallowing huge amounts of gas and dust from their surrounding environments. In this process, the material heats up and becomes very bright – becoming the most energetic sources of emission in the Universe known as active galactic nuclei.

The hot dust forms a ring around the supermassive black hole and emits infrared radiation, which the scientists used as the ruler. The physical size of the ring can be determined from the time delay between brightness variations of the central source and the echoing infrared response of the dust; for NGC 4151, this gives a size of about 30 light-days.

By combining the light from the two 10-m Keck telescopes on Mauna Kea on Hawaii using a method called interferometry, the scientists achieved an effective resolution equivalent to a telescope with a perfect 85-meter diameter mirror. This provided very high resolution – a hundred times better resolution than the Hubble Space Telescope – and allowed them to measure the angular size of the dust ring on the sky.

By combining the physical size of 30 light-days with the apparent size measured with the data from the Keck interferometer, the astronomers were able to determine the distance to NGC 4151. “We calculated the distance to be 62 million light-years,” said Dr Darach Watson of the University of Copenhagen’s Niels Bohr Institute, who is a co-author of the paper published in the journal Nature.
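The underlying geometry is just the small-angle relation distance = physical size / angular size. A sketch of the calculation (the angular size below is back-computed from the quoted numbers for illustration; the study's measured value may differ slightly):

```python
import math

LIGHT_DAYS_PER_LY = 365.25
MAS_PER_RAD = 180.0 / math.pi * 3600.0 * 1000.0   # milliarcseconds per radian

def distance_ly(size_light_days, angular_size_mas):
    """Small-angle distance: D = physical size / angular size (radians)."""
    theta_rad = angular_size_mas / MAS_PER_RAD
    return size_light_days / theta_rad / LIGHT_DAYS_PER_LY

# A ~30 light-day ring subtending roughly 0.27 milliarcseconds on the sky
# sits around 62 million light-years away (illustrative back-calculation).
d = distance_ly(30, 0.27)
```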

“The previous calculations based on redshift were between 13 million and 95 million light-years, so we have gone from a great deal of uncertainty to now being able to determine the precise distance. This is very significant for astronomical calculations of cosmic scale distances.” “Such distances are key in pinning down the cosmological parameters that characterize our Universe or for accurately measuring black hole masses,” Dr Hoenig added.

“Indeed, NGC 4151 is a crucial anchor to calibrate various techniques to estimate black hole masses. Our new distance implies that these masses may have been systematically underestimated by 40 per cent.”


Krubera, Earth's deepest cave, mapped - it takes 1 month to descend to the bottom

The Krubera cave is located in the Arabika Massif mountain range on the edge of the Black Sea in Abkhazia, which some argue is part of Georgia. Once said to be bottomless, Earth's deepest cave has now been mapped: intrepid explorers have charted every known twist and turn of the terrifying chasm, which measures 7,208 ft (2,197 m) deep. And with every expedition the cave seems to become deeper, as divers plunge to depths never before visited by humans, extending its known reach into the Earth.

The cave is also known in Russia as Voronya, meaning 'crows' cave' - slang used by Kiev cavers during the 1980s because of the number of crows nesting in the entrance pit. The Arabika Massif is one of the largest high-mountain limestone karst massifs (the main mass of an exposed structure) in the Western Caucasus, an area of southern Russia. It is composed of Lower Cretaceous and Upper Jurassic limestones that dip continuously southwest to the Black Sea and plunge below the modern sea level. The cave, named after Russian geographer Alexander Kruber, is the only chasm on Earth known to be deeper than 6,561 ft (2,000 m).

In 2005, a series of expeditions brought a team of 56, carrying some five tons of equipment, into the chasm. Much like scaling a mountain, the team had to cover set distances so they could establish camps at depths of 2,300, 3,986, 4,630 and 5,380 ft (700, 1,215, 1,410 and 1,640 m). The explorers were able to cook meals, sleep in tents and huddle together for warmth before venturing down the limestone rock faces for up to 20 hours at a time, sometimes through extremely cold water. It takes about one month to climb down to the bottom.


The search continues for dark photons, hypothetical messengers of an invisible universe hidden from us

The matter we know accounts for less than 5 percent of the universe; the rest is filled with invisible dark matter and dark energy. Scientists working on a new experiment to be conducted at Thomas Jefferson National Accelerator Facility in Virginia hope to shed light on some of those cosmic unknowns. According to certain theories known as hidden-sector models, dark matter is thought to consist of particles that interact with regular matter through gravitation, which is why we know about it, but not through the electromagnetic, strong and weak fundamental forces (which is why it is hard to detect). Such dark matter would interact with regular matter and with itself through yet-to-be-discovered hidden-sector forces. Scientists believe that heavy photons—also called dark photons—might be mediators of such a dark force, just as regular photons are carriers of the electromagnetic force between normal charged particles.

The Heavy Photon Search at Jefferson Lab will hunt for these dark, more massive cousins of light.

“The heavy photon could be the key to a whole rich world with many new dark particles and forces,” says Rouven Essig, a Stony Brook University theoretical physicist who in recent years helped develop the theory for heavy-photon searches.  If heavy photons exist, researchers want to create them in the lab.

Theoretically, a heavy photon can transform into what is known as a virtual photon—a short-lived fluctuation of electromagnetic energy with mass—and vice versa. This should happen only very rarely and for a very short time, but it still means that experiments that produce virtual photons could in principle also generate heavy photons. Producing enormous numbers of virtual photons may create detectable amounts of heavy ones.

At Jefferson Lab’s Continuous Electron Beam Accelerator Facility, CEBAF, scientists will catapult electrons into a tungsten target, which will generate large numbers of virtual photons—and perhaps some heavy photons, too. The photon mass measured in the experiment matters because a heavy photon has a unique mass, whereas virtual photons appear with a broad range of masses. “The heavy photon would reveal itself as a sharp bump on top of a smooth background from the virtual photon decays,” says SLAC National Accelerator Laboratory’s John Jaros, another HPS spokesperson.

The location at which a photon decays into an electron-positron pair also matters: virtual photons decay almost instantaneously within the target, says Timothy Nelson, project lead for the silicon detector, which is being built at SLAC. Heavy photons could decay more slowly, after traveling beyond the target, so pairs produced outside the target can only come from heavy photons. The HPS silicon detector's unique ability to identify outside-of-target decays sets it apart from other experiments currently participating in a worldwide hunt for heavy photons.
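The "bump" HPS looks for sits in the invariant-mass spectrum of detected electron-positron pairs: a heavy photon of definite mass piles events up at a single value, while virtual-photon decays spread smoothly across masses. A sketch of the standard invariant-mass computation (the four-vectors below are illustrative, in natural units with c = 1; this is textbook kinematics, not HPS analysis code):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system from (E, px, py, pz)
    four-vectors, natural units (c = 1)."""
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))   # guard against tiny negative rounding

# Two back-to-back 50 MeV particles reconstruct to a 100 MeV parent
# (illustrative numbers in GeV, neglecting the electron mass).
m = invariant_mass((0.050, 0.050, 0.0, 0.0),
                   (0.050, -0.050, 0.0, 0.0))
```

Histogramming this quantity over many events and searching for a narrow excess over the smooth background is the bump hunt described above.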

The HPS calorimeter, whose construction was led by researchers from the French Institut de Physique Nucléaire, the Italian Istituto Nazionale di Fisica Nucleare and Jefferson Lab, is currently being tested at Jefferson Lab, while scientists at SLAC plan to ship their detector early next year. The experiment is scheduled to begin in the spring of 2015.


Tooth loss in birds occurred about 116 million years ago

The absence of teeth, or "edentulism," has evolved on multiple occasions within vertebrates, including birds, turtles, and a few groups of mammals such as anteaters, baleen whales and pangolins. Where early birds are concerned, the fossil record is fragmentary. A question that has intrigued biologists is: based on this fossil record, were teeth lost in the common ancestor of all living birds, or convergently in two or more independent lineages of birds? A research team led by biologists at the University of California, Riverside and Montclair State University, NJ, has found an answer. Using the degraded remnants of tooth genes in birds to determine when birds lost their teeth, the team reports in the Dec. 12 issue of Science that teeth were lost in the common ancestor of all living birds more than 100 million years ago.

"One of the larger lessons of our finding is that 'dead genes,' like the remnants of dead organisms that are preserved in the fossil record, have a story to tell," said Mark Springer, a professor of biology and one of the lead authors of the study along with Robert Meredith at Montclair State University who was previously a graduate student and postdoctoral researcher in Springer's laboratory. "DNA from the crypt is a powerful tool for unlocking secrets of evolutionary history."

Springer explained that edentulism and the presence of a horny beak are hallmark features of modern birds. "Ever since the discovery of the fossil bird Archaeopteryx in 1861, it has been clear that living birds are descended from toothed ancestors," he said. "However, the history of tooth loss in the ancestry of modern birds has remained elusive for more than 150 years."

All toothless or enamelless vertebrates are descended from an ancestor with enamel-capped teeth; in the case of birds, that ancestor was a toothed theropod dinosaur. Modern birds use a horny beak instead of teeth, along with part of their digestive tract, to grind up and process food.

Tooth formation in vertebrates is a complicated process that involves many different genes. Of these genes, six are essential for the proper formation of dentin (DSPP) and enamel (AMTN, AMBN, ENAM, AMELX, MMP20).

The researchers examined these six genes in the genomes of 48 bird species, which represent nearly all living bird orders, for the presence of inactivating mutations that are shared by all 48 birds. The presence of such shared mutations in dentin and enamel-related genes would suggest a single loss of mineralized teeth in the common ancestor of all living birds.

Springer, Meredith, and other members of their team found that the 48 bird species share inactivating mutations in both the dentin-related gene (DSPP) and enamel-related genes (ENAM, AMELX, AMTN, MMP20), indicating that the genetic machinery necessary for tooth formation was lost in the common ancestor of all modern birds.
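The inference step can be sketched with toy data (gene names are from the article; the species labels and mutation positions are entirely hypothetical): an inactivating mutation found in every sampled species is most parsimoniously explained by a single loss event in their common ancestor.

```python
# Hypothetical inactivating mutations per species, keyed as (gene, position).
observed = {
    "species_a": {("DSPP", 112), ("ENAM", 40), ("MMP20", 7)},
    "species_b": {("DSPP", 112), ("ENAM", 40)},
    "species_c": {("DSPP", 112), ("ENAM", 40), ("AMELX", 3)},
}

# Mutations present in every species point to a single ancestral event;
# species-specific ones arose later, on individual lineages.
shared = set.intersection(*observed.values())
```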


The fastest camera ever created records events at 100 billion frames per second

You can watch exactly how light interacts with objects using a camera that captures 100 billion frames per second.

The capture of transient scenes at high imaging speed has long been sought by photographers, with early examples being the well-known 1878 recording of a horse in motion and the 1887 photograph of a supersonic bullet. However, not until the late twentieth century were breakthroughs achieved in ultrahigh-speed imaging (more than 10⁵ frames per second). In particular, the introduction of electronic imaging sensors based on charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) technology revolutionized high-speed photography, enabling acquisition rates of up to 10⁷ frames per second. Despite these sensors' widespread impact, further increasing frame rates with CCD or CMOS technology is fundamentally limited by their on-chip storage and electronic readout speed.

A team of scientists now demonstrates a two-dimensional dynamic imaging technique, compressed ultrafast photography (CUP), which can capture non-repetitive time-evolving events at up to 10¹¹ frames per second. Compared with existing ultrafast imaging techniques, CUP has the prominent advantage of measuring an x, y, t scene (x, y: spatial coordinates; t: time) with a single camera snapshot, allowing observation of transient events with a temporal resolution of tens of picoseconds. Furthermore, akin to traditional photography, CUP is receive-only and does not need the specialized active illumination required by other single-shot ultrafast imagers. As a result, CUP can image a variety of luminescent objects, such as fluorescent or bioluminescent ones. Using CUP, the researchers visualized four fundamental physical phenomena with single laser shots: laser pulse reflection, refraction, photon racing in two media, and faster-than-light propagation of non-information (that is, motion that appears faster than the speed of light but cannot convey information). Given CUP's capability, the researchers expect it to find widespread applications in both fundamental and applied sciences, including biomedical research.
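The quoted frame rate and temporal resolution are two sides of the same number: at 10¹¹ frames per second, successive frames are 10 picoseconds apart.

```python
frame_rate_hz = 1e11                          # 100 billion frames per second
frame_interval_s = 1.0 / frame_rate_hz        # time between frames, seconds
frame_interval_ps = frame_interval_s * 1e12   # ~10 ps between frames
```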


New Low-Cost Lithography Technique of Creating 3-D Nanostructures

Researchers from North Carolina State University have developed a new lithography technique that uses nanoscale spheres to create three-dimensional (3-D) structures with biomedical, electronic and photonic applications. The new technique is significantly less expensive than conventional methods and does not rely on stacking two-dimensional (2-D) patterns to create 3-D structures.

“Our approach reduces the cost of nanolithography to the point where it could be done in your garage,” says Dr. Chih-Hao Chang, an assistant professor of mechanical and aerospace engineering at NC State and senior author of a paper on the work.

Most conventional lithography uses a variety of techniques to focus light on a photosensitive film to create 2-D patterns. These techniques rely on specialized lenses, electron beams or lasers – all of which are extremely expensive. Other conventional techniques use mechanical probes, which are also costly. To create 3-D structures, the 2-D patterns are essentially printed on top of each other. The NC State researchers took a different approach, placing nanoscale polystyrene spheres on the surface of the photosensitive film.

The nanospheres are transparent, but bend and scatter the light that passes through them in predictable ways according to the angle that the light takes when it hits the nanosphere. The researchers control the nanolithography by altering the size of the nanosphere, the duration of light exposures, and the angle, wavelength and polarization of light. The researchers can also use one beam of light, or multiple beams of light, allowing them to create a wide variety of nanostructure designs.

“We are using the nanosphere to shape the pattern of light, which gives us the ability to shape the resulting nanostructure in three dimensions without using the expensive equipment required by conventional techniques,” Chang says. “And it allows us to create 3-D structures all at once, without having to make layer after layer of 2-D patterns.”

The researchers have also shown that they can get the nanospheres to self-assemble in a regularly-spaced array, which in turn can be used to create a uniform pattern of 3-D nanostructures.

“This could be used to create an array of nanoneedles for use in drug delivery or other applications,” says Xu Zhang, a Ph.D. student in Chang’s lab and lead author of the paper.
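As a first-order illustration of the bending the nanospheres exploit, Snell's law gives the refraction angle at the sphere surface (polystyrene's refractive index is about 1.59). Real nanoscale behavior involves near-field and Mie-scattering effects well beyond this sketch, so treat it as ray-optics intuition only.

```python
import math

# Snell's law: n1 * sin(theta_in) = n2 * sin(theta_out)
N_AIR, N_POLYSTYRENE = 1.0, 1.59

def refraction_angle_deg(theta_in_deg, n1=N_AIR, n2=N_POLYSTYRENE):
    """Angle of the transmitted ray, measured from the surface normal."""
    return math.degrees(math.asin(n1 / n2 * math.sin(math.radians(theta_in_deg))))

# A ray hitting the sphere surface at 30 degrees bends toward the normal:
theta_out = refraction_angle_deg(30.0)   # roughly 18.3 degrees
```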

Scooped by Dr. Stefan Gruenwald!

Four-layer prototype of high-rise computer chip built by Stanford engineers

For decades, the mantra of electronics has been smaller, faster, cheaper. Today, Stanford engineers add a fourth word - taller.  At a conference in San Francisco, a Stanford team will reveal how to build high-rise chips that could leapfrog the performance of the single-story logic and memory chips on today's circuit cards.

Those circuit cards are like busy cities in which logic chips compute and memory chips store data. But when the computer gets busy, the wires connecting logic and memory can get jammed. The Stanford approach would end these jams by building layers of logic atop layers of memory to create a tightly interconnected high-rise chip. Many thousands of nanoscale electronic "elevators" would move data between the layers much faster, using less electricity, than the bottleneck-prone wires connecting single-story logic and memory chips today.

The work is led by Subhasish Mitra, a Stanford professor of electrical engineering and computer science, and H.-S. Philip Wong, the Willard R. and Inez Kerr Bell Professor in Stanford's School of Engineering. They describe their new high-rise chip architecture in a paper being presented at the IEEE International Electron Devices Meeting on Dec. 15-17. The researchers' innovation leverages three breakthroughs.

The first is a new technology for creating transistors, those tiny gates that switch electricity on and off to create digital zeroes and ones. The second is a new type of computer memory that lends itself to multi-story fabrication. The third is a technique to build these new logic and memory technologies into high-rise structures in a radically different way than previous efforts to stack chips.

"This research is at an early stage, but our design and fabrication techniques are scalable," Mitra said. "With further development this architecture could lead to computing performance that is much, much greater than anything available today."

Scooped by Dr. Stefan Gruenwald!

Physics professor publishes exact solution to model Big Bang and quark gluon plasma

Unlike in mathematics, it is rare to have exact solutions to physics problems.

"When they do present themselves, they are an opportunity to test the approximation schemes (algorithms) that are used to make progress in modern physics," said Michael Strickland, Ph.D., associate professor of physics at Kent State University. Strickland and four of his collaborators recently published an exact solution in the journal Physical Review Letters that applies to a wide array of physics contexts and will help researchers to better model galactic structure, supernova explosions and high-energy particle collisions, such as those studied at the Large Hadron Collider at CERN in Switzerland. In these collisions, experimentalists create a short-lived high-temperature plasma of quarks and gluons called quark gluon plasma (QGP), much like what is believed to be the state of the universe milliseconds after the Big Bang 13.8 billion years ago.

In their article, Strickland and co-authors Gabriel S. Denicol of McGill University, Ulrich Heinz and Mauricio Martinez of the Ohio State University, and Jorge Noronha of the University of São Paulo presented the first exact solution that describes a system that is expanding at relativistic velocities radially and longitudinally.

The equation that was solved was invented by Austrian physicist Ludwig Boltzmann in 1872 to model the dynamics of fluids and gases. This equation was ahead of its time since Boltzmann imagined that matter was atomic in nature and that the dynamics of the system could be understood solely by analyzing collisional processes between sets of particles.

"In the last decade, there has been a lot of work modeling the evolution of the quark gluon plasma using hydrodynamics in which the QGP is imagined to be fluidlike," Strickland said. "As it turns out, the equations of hydrodynamics can be obtained from the Boltzmann equation and, unlike the hydrodynamical equations, the Boltzmann equation is not limited to the case of a system that is in (or close to) thermal equilibrium.

"Both types of expansion occur in relativistic heavy ion collisions, and one must include both if one hopes to make a realistic description of the dynamics," Strickland continued. "The new exact solution has both types of expansion and can be used to tell us which hydrodynamical framework is the best."
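For reference, the kinetic equation at issue, written in the relativistic relaxation-time approximation commonly employed in this line of work (sign and metric conventions vary; this is a standard textbook form, not copied from the paper):

```latex
% Relativistic Boltzmann equation in the relaxation-time approximation (RTA):
% f(x,p) is the one-particle distribution function, u^mu the fluid four-velocity,
% T the local temperature, and tau_eq the relaxation time toward equilibrium.
p^{\mu}\,\partial_{\mu} f(x,p)
  = \frac{p^{\mu} u_{\mu}}{\tau_{\rm eq}}
    \left[ f_{\rm eq}\!\left(\tfrac{p^{\mu} u_{\mu}}{T}\right) - f(x,p) \right]
```

Hydrodynamics emerges when f stays close to f_eq; the exact solution lets one test how well that approximation holds far from equilibrium.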

The abstract for this article can be found at …/PhysRevLett.113.202301.

Scooped by Dr. Stefan Gruenwald!

Life forms uncovered by the deepest-ever marine drilling expedition have been analyzed by scientists

The International Ocean Discovery Program (IODP) found microbes living 2,400m beneath the seabed off Japan. The tiny, single-celled organisms survive in this harsh environment on a low-calorie diet of hydrocarbon compounds and have a very slow metabolism. The findings are being presented at the American Geophysical Union Fall Meeting.

Elizabeth Trembath-Reichert, from the California Institute of Technology, who is part of the team that carried out the research, said: "We keep looking for life, and we keep finding it, and it keeps surprising us as to what it appears to be capable of." The IODP Expedition 337 took place in 2012 off the coast of Japan’s Shimokita Peninsula in the northwestern Pacific. From the Chikyu ship, a monster drill was set down more than 1,000m (3,000ft) beneath the waves, where it penetrated a record-breaking 2,446m (8,024ft) of rock under the seafloor. Samples were taken from the ancient coal bed system that lies at this depth, and were returned to the ship for analysis.

The team found that microbes, despite having no light, no oxygen, barely any water and very limited nutrients, thrived in the cores. To find out more about how this life from the "deep biosphere" survives, the researchers set up a series of experiments in which they fed the little, spherical organisms different compounds. Dr Trembath-Reichert said: "We chose these coal beds because we knew there was carbon, and we knew that this carbon was about as tasty to eat, when it comes to coal, as you could get for microbes. "The thought was that while there are some microbes that can eat compounds in coal directly, there may be smaller organic compounds – methane and other types of hydrocarbons - sourced from the coal that the microbes could eat as well."

The experiments revealed that the microbes were indeed dining on these methyl compounds. The tests also showed that the organisms lived life in the slow lane, with an extremely sluggish metabolism.

Scooped by Dr. Stefan Gruenwald!

MIT discovers new mathematical law for superconductors

MIT researchers have discovered a new mathematical relationship — between material thickness, temperature, and electrical resistance — that appears to hold in all superconductors. They describe their findings in the latest issue of Physical Review B.

The result could shed light on the nature of superconductivity and could also lead to better-engineered superconducting circuits for applications like quantum computing and ultralow-power computing.

“We were able to use this knowledge to make larger-area devices, which were not really possible to do previously, and the yield of the devices increased significantly,” says Yachin Ivry, a postdoc in MIT’s Research Laboratory of Electronics, and the first author on the paper. Ivry works in the Quantum Nanostructures and Nanofabrication Group, which is led by Karl Berggren, a professor of electrical engineering and one of Ivry’s co-authors on the paper. Among other things, the group studies thin films of superconductors.

Superconductors are materials that, at temperatures near absolute zero, exhibit no electrical resistance; this means that it takes very little energy to induce an electrical current in them. A single photon will do the trick, which is why they’re useful as quantum photodetectors. And a computer chip built from superconducting circuits would, in principle, consume about one-hundredth as much energy as a conventional chip.

“Thin films are interesting scientifically because they allow you to get closer to what we call the superconducting-to-insulating transition,” Ivry says. “Superconductivity is a phenomenon that relies on the collective behavior of the electrons. So if you go to smaller and smaller dimensions, you get to the onset of the collective behavior.”
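The paper reports a single relationship among thickness, critical temperature, and sheet resistance. Assuming, purely for illustration, that it takes the power-law form d·Tc = A·Rs^(−B) (the constants A and B below are invented, not fitted values from the paper), such a law could be used like this:

```python
# Hedged sketch of a thickness-temperature-resistance power law,
# d * Tc = A * Rs**(-B), with made-up constants for illustration.
A, B = 8000.0, 1.0

def critical_temperature(d_nm, rs_ohm_per_sq):
    """Predicted Tc (kelvin) for thickness d (nm) and sheet resistance Rs (ohm/sq)."""
    return A * rs_ohm_per_sq ** (-B) / d_nm

tc_a = critical_temperature(10.0, 100.0)  # thicker, low-resistance film: higher Tc
tc_b = critical_temperature(5.0, 400.0)   # thinner, high-resistance film: lower Tc
```

A relation of this kind is what lets a fabricator predict whether a given film geometry will still superconduct at the operating temperature, which is why it improved device yield.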

Scooped by Dr. Stefan Gruenwald!

World Energy Outlook: In 2040, Fossil Fuels Will Still Reign

By 2040, the world's energy supply mix will be divided into four nearly equal parts: oil, gas, coal, and low-carbon sources (nuclear and renewables), according to the International Energy Agency's (IEA) 2014 World Energy Outlook. The IEA's assessment finds that under current and planned policies, the average global temperature will increase by 3.6 degrees Celsius by 2100. Renewable energy takes a far greater role in new electricity supply in the near future, expanding from about 1,700 gigawatts today to 4,550 gigawatts in 2040, but that is not enough to offset the global dominance of fossil fuels.
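A quick back-of-envelope check on that renewables figure (taking 2014 as the base year, an assumption; the Outlook's own modeling is far more detailed):

```python
# Implied compound annual growth of renewable capacity in the projection:
# roughly 1,700 GW today growing to 4,550 GW by 2040.
years = 2040 - 2014
cagr = (4550 / 1700) ** (1 / years) - 1
# about 3.9% per year, sustained for over a quarter century
```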

“As our global energy system grows and transforms, signs of stress continue to emerge,” IEA Executive Director Maria van der Hoeven, said in a statement. “But renewables are expected to go from strength to strength, and it is incredible that we can now see a point where they become the world’s number one source of electricity generation.”

Renewable energy production will double as a share of world electricity demand by 2040, according to the report. But that still does not dethrone coal in electricity generation. Coal will simply shift regionally from the United States and China to Southeast Asia and India, according to the IEA.

Energy efficiency, the least glamorous piece of the puzzle, is poised to be a winner in the coming decades and could have an even greater impact if some of the world's largest energy users follow through on proposed efficiency plans. Efficiency measures are set to halve the global growth in energy demand from 2 percent annually to about 1 percent beginning in 2025, according to the IEA.

Efficiency standards for cars and more stringent energy efficiency targets for industry and everyday devices are key to slowing the demand for energy, but they do not necessarily help diminish the world's reliance on fossil fuels, because the true price of fossil fuels is not accurately reflected in the price people pay in some regions.

Fossil fuels received about $550 billion in subsidies in 2013, compared with $120 billion for all renewable energies. Although fossil-fuel subsidies were $25 billion lower than in 2012, there is still vast room for improvement in ending price breaks for these mature industries, especially in the gas- and oil-rich nations that offer the bulk of the subsidies.

pdeppisch's comment, December 15, 2014 4:20 PM
Except that the world will not be recognizable in 2040!
J. Steven Sprenger ✔'s curator insight, December 16, 2014 4:07 PM

Disruptive technologies, as the article points out, could be the game changer that could change these projections. 

Scooped by Dr. Stefan Gruenwald!

Humans Began Using Fire 350,000 Years Ago, Israeli Cave Study Reveals

When early humans discovered how to purposefully create fire and harness it for their survival, it was a feat comparable to such modern milestones as sending men to the Moon. Yet while the mastery of fire is hailed as one of the most crucial developments in human history and evolution, researchers are not certain when it happened.

Some anthropologists believe that early humans started to exploit fire as early as 1.5 million years ago, but much of the evidence supporting this claim such as the heated clays and charcoal fragments is disputed because they can be attributed to natural bush fires. Many experts also think that the early uses of fire were opportunistic, meaning early humans used natural bush fires instead of starting the fire themselves.

A group of archeologists studying artifacts from an ancient cave, however, claims to have figured out when humans learned to master fire. For their study published in the journal Science on Oct. 19, Ron Shimelmitz, from the Zinman Institute of Archaeology of the University of Haifa in Israel, and colleagues examined artifacts, most of which were flint tools and debris excavated from Israel's Tabun Cave. The archeological site, declared of universal value by UNESCO two years ago, documents half a million years of human history and provided the researchers with the opportunity to study how the use of fire evolved in the cave.

By examining the cave's sediment layers, the researchers found that most of the flints were not burned in layers older than 350,000 years. Burned flints, however, started to show up more regularly after that point, most of them marked by cracking, red or black coloration, and small round depressions where fragments called pot lids flaked off the stone, all signs of exposure to fire.

Shimelmitz and colleagues said that while fire had been in use for a long time, it took a while before humans learned how to start and control it, with the study indicating that habitual use of fire in Israel's Tabun Cave began only between 350,000 and 320,000 years ago. "While hominins may have used fire occasionally, perhaps opportunistically, for some million years, we argue here that it only became a consistent element in behavioral adaptations during the second part of the Middle Pleistocene," the researchers wrote.

Scooped by Dr. Stefan Gruenwald!

New treatment technique applying magnetic pulses to ferromagnetic nanorods to deliver drugs deep into body

A new technique to magnetically deliver drug-carrying nanorods to deep targets in the body using fast-pulsed magnetic fields could transform the way deep-tissue tumors and other diseases are treated, say researchers at the University of Maryland (UMD) and Bethesda-based Weinberg Medical Physics LLC (WMP).

Instead of surgery or systemically administered treatments (such as chemotherapy), the use of magnetic nanoparticles as drug carriers could potentially allow clinicians to use external magnets to focus therapy to the precise locations of a disease within a patient, such as inoperable deep tumors or sections of the brain that have been damaged by trauma, vascular, or degenerative diseases.

So for years, researchers have worked with magnetic nanoparticles loaded with drugs or genes to develop noninvasive techniques to direct therapies and diagnostics to targets in the body. However, due to the physics of magnetic forces, unaided particles can only be attracted toward a magnet, not concentrated at points distant from the magnet face. So in clinical trials, magnets held outside the body have only been able to concentrate treatment at targets on or just below the skin surface, the researchers say.

“What we have shown experimentally is that by exploiting the physics of nanorods we can use fast-pulsed magnetic fields to focus the particles to a deep target between the magnets,” said UMD Institute for Systems Research Professor Benjamin Shapiro. 

These pulsed magnetic fields allowed the team to reverse the usual behavior of magnetic nanoparticles. Instead of a magnet attracting the particles, they showed that an initial magnetic pulse can orient the rod-shaped particles without pulling them, and then a subsequent pulse can push the particles before they can reorient. By repeating the pulses in sequence, the particles were focused to locations between the electromagnets. The study, published last week in Nano Letters, shows that with this method, ferromagnetic nanorods carrying drugs or molecules can be concentrated at arbitrarily deep locations between magnets.

The researchers are now working to demonstrate this method in vivo to prove its therapeutic potential and have launched IronFocus Medical, Inc., a startup company established to commercialize their invention.
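A deliberately crude 1-D caricature of why alternating orient-and-push pulses can focus particles between two magnets rather than onto a magnet face: each cycle delivers a net repulsive kick away from each magnet, with strength decaying with distance, so the only stable resting point is the midpoint. The kick function and constants below are invented for illustration and are not the team's physical model.

```python
# Magnets sit at x = 0 and x = 1; the particle lives on (0, 1).
def pulse_cycle(x, k=0.01):
    kick = lambda d: 1.0 / (d + 0.1) ** 2   # decaying kick strength (arbitrary form)
    return x + k * kick(x) - k * kick(1.0 - x)

x = 0.2                      # particle starts near the left magnet
for _ in range(2000):        # repeat the orient-then-push cycle
    x = pulse_cycle(x)
# x converges to ~0.5, midway between the magnets
```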

Scooped by Dr. Stefan Gruenwald!

Organic electronic sensors can be stuck on the skin like a Band-Aid

“There are various pulse oximeters already on the market that measure pulse rate and blood-oxygen saturation levels, but those devices use rigid conventional electronics, and they are usually fixed to the fingers or earlobe,” said Ana Arias, an associate professor of electrical engineering and computer sciences and head of the UC Berkeley team that is developing a new organic optoelectronic sensor.

By switching from silicon to an organic, or carbon-based, design, the researchers were able to create a device that could ultimately be thin, cheap and flexible enough to be slapped on like a Band-Aid during that jog around the track or hike up the hill. The engineers put the new prototype up against a conventional pulse oximeter and found that the pulse and oxygen readings were just as accurate.

A conventional pulse oximeter typically uses light-emitting diodes (LEDs) to send red and infrared light through a fingertip or earlobe. Sensors detect how much light makes it through to the other side. Bright, oxygen-rich blood absorbs more infrared light, while the darker hues of oxygen-poor blood absorb more red light. The ratio of the two wavelengths reveals how much oxygen is in the blood. For the organic sensors, Arias and her team of graduate students – Claire Lochner, Yasser Khan and Adrien Pierre – used red and green light, which yield comparable differences to red and infrared when it comes to distinguishing high and low levels of oxygen in the blood.

Using a solution-based processing system, the researchers deposited the green and red organic LEDs and the translucent light detectors onto a flexible piece of plastic. By detecting the pattern of fresh arterial blood flow, the device can calculate a pulse.

“We showed that if you take measurements with different wavelengths, it works, and if you use unconventional semiconductors, it works,” said Arias. “Because organic electronics are flexible, they can easily conform to the body.” Arias added that because the components of conventional oximeters are relatively expensive, healthcare providers will choose to disinfect them if they become contaminated. In contrast, “organic electronics are cheap enough that they are disposable like a Band-Aid after use,” she said.
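The two-wavelength measurement described above is commonly reduced to a "ratio of ratios." The sketch below runs that computation on synthetic waveforms; the linear calibration mapping the ratio to oxygen saturation is a generic textbook approximation, not the Berkeley device's calibration.

```python
import numpy as np

t = np.linspace(0, 10, 1000)                # 10 s of samples
pulse = np.sin(2 * np.pi * 1.2 * t)         # ~72 beats-per-minute waveform

red   = 2.0 + 0.10 * pulse                  # synthetic detected intensities
green = 4.0 + 0.30 * pulse

def ratio_of_ratios(a, b):
    ac = lambda s: s.max() - s.min()        # pulsatile (AC) component
    dc = lambda s: s.mean()                 # baseline (DC) component
    return (ac(a) / dc(a)) / (ac(b) / dc(b))

R = ratio_of_ratios(red, green)             # ~0.67 for these synthetic signals
spo2 = 110 - 25 * R                         # generic empirical calibration line
```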

Scooped by Dr. Stefan Gruenwald!

Discovering the Undiscovered: The time is right to apply genomic technologies to discover new life on Earth

In a perspective piece published November 6 in the journal Science, Eddy Rubin, Director of the U.S. Department of Energy Joint Genome Institute (DOE JGI), a DOE Office of Science User Facility, along with Microbial Program Head Tanja Woyke, discusses why the time is right to apply genomic technologies to discover new life on Earth. In this perspective they propose the division of microbial life on Earth into three categories: explored, unexplored, and undiscovered. The first can be grown in the laboratory. The second encompasses the uncultivated organisms from environmental samples known only by their molecular signatures. The third, the focus of the perspective, is the yet-undiscovered life that up until now has eluded detection.

“We are poised, armed with a new toolkit of powerful genomic technologies to generate and mine the increasingly large datasets to discover new life that may be strikingly different from those that we catalogued thus far,” said Rubin. “Nature has been tinkering with life for at least three billion years and we now have a new set of ways to look for novel life that have so far eluded discovery.”

“Massive-scale metagenomic sequencing of environmental DNA and RNA samples should, in principle, generate sequence data from any entity for which nucleic acids can be extracted,” Rubin noted. “Analysis of these data to identify outliers to previously defined life represents a powerful means to explore the unknown.”

In addition, Rubin pointed to the advent of single-cell sequencing with microfluidic and cell sorting approaches, focused specifically on cells that lack genes that match previously identified ones, as another approach in the search for completely novel organisms.

“We also need to choose particularly suitable environmental niches so that we are not just looking, ‘under the street lamp’ — at environments that we have already previously studied.”

Rubin suggested targets for the discovery of novel life including extreme, inhospitable and isolated environments that are expected to be preferred niches for early life, potentially sheltered from more modern microbial competitors. This would include low oxygen subsurface sites with environmental conditions predating the Great Oxidation Event that occurred about 2.3 billion years ago when the atmosphere went from very low to high oxygen concentrations. Support for the idea that isolated low-oxygen environments may be preferred niches for early life comes from observations that anaerobic niches deep within Earth’s crust tend to harbor ancient branches within the domains of life.

Exploring the “undiscovered” classification is expected to be a boon for enriching the public data portals, Rubin said. He also noted that lurking among these difficult ones may well be the discovery of a “fourth domain” of life, to which a reasonable mariner, ancient or contemporary, may proclaim, “full speed ahead.”
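One concrete (and entirely hypothetical; this is not the JGI pipeline) way to flag a sequence as an "outlier to previously defined life" is to compare nucleotide-composition signatures, such as tetranucleotide frequencies, against profiles of known organisms:

```python
from itertools import product
import math

def kmer_profile(seq, k=4):
    """Normalized tetranucleotide-frequency vector, a classic composition signature."""
    counts = {"".join(p): 0 for p in product("ACGT", repeat=k)}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in counts:                  # skip k-mers with ambiguous bases
            counts[kmer] += 1
    total = sum(counts.values()) or 1
    return [c / total for c in counts.values()]

def cosine_distance(p, q):
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return 1.0 - dot / norm if norm else 1.0

# A sequence far (near 1.0) from every reference profile is a candidate outlier.
d_same = cosine_distance(kmer_profile("ACGT" * 50), kmer_profile("ACGT" * 50))
d_diff = cosine_distance(kmer_profile("A" * 200), kmer_profile("G" * 200))
```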

Rubin presented recent work on “microbial dark matter” at the DOE Joint Genome Institute’s 2014 Genomics of Energy and Environment Meeting. The DOE JGI’s 10th Annual Meeting will be held March 24-26, 2015.

Dahl Winters's curator insight, December 18, 2014 8:00 AM

A big use of big data - exploring the genomes of life on Earth.  One of the biggest data sets in the world is the one we carry around with us and on us every day.

Scooped by Dr. Stefan Gruenwald!

Chemists Fabricate Novel Rewritable Paper Using Color Switching Redox Dyes

First developed in China in about the year A.D. 150, paper has many uses, the most common being for writing and printing upon. Indeed, the development and spread of civilization owes much to paper's use as a writing material. According to surveys, 90 percent of all information in businesses today is retained on paper, even though the bulk of this printed paper is discarded after a single use. This waste of paper and ink cartridges, not to mention the accompanying environmental problems such as deforestation and chemical pollution of air, water and land, could be curtailed if the paper were "rewritable," that is, capable of being written on and erased multiple times.

Chemists at the University of California, Riverside have now fabricated in the lab just such novel rewritable paper, one that is based on the color switching property of commercial chemicals called redox dyes.  The dye forms the imaging layer of the paper.  Printing is achieved by using ultraviolet light to photobleach the dye, except the portions that constitute the text on the paper.  The new rewritable paper can be erased and written on more than 20 times with no significant loss in contrast or resolution.

“This rewritable paper does not require additional inks for printing, making it both economically and environmentally viable,” said Yadong Yin, a professor of chemistry, whose lab led the research. “It represents an attractive alternative to regular paper in meeting the increasing global needs for sustainability and environmental conservation.”

The rewritable paper is, in essence, an imaging medium in the form of glass or plastic film to which letters and patterns can be repeatedly printed, retained for days, and then erased by simple heating.

The paper comes in three primary colors: blue, red and green, produced by using the commercial redox dyes methylene blue, neutral red and acid green, respectively.  Included in the dye are titania nanocrystals (these serve as catalysts) and the thickening agent hydroxyethyl cellulose (HEC).  The combination of the dye, catalysts and HEC lends high reversibility and repeatability to the film.

During the writing phase, ultraviolet light reduces the dye to its colorless state.  During the erasing phase, re-oxidation of the reduced dye recovers the original color; that is, the imaging material recovers its original color by reacting with ambient oxygen.  Heating to 115 °C can speed up the reaction so that the erasing process is often completed in less than 10 minutes. “The printed letters remain legible with high resolution at ambient conditions for more than three days – long enough for practical applications such as reading newspapers,” Yin said. “Better still, our rewritable paper is simple to make, has low production cost, low toxicity and low energy consumption.”
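The reported speed-up from gentle heating is broadly consistent with simple Arrhenius kinetics. In the sketch below the activation energy is an assumed, purely illustrative value, not a number from the study:

```python
import math

# Arrhenius rate ratio between 25 C and 115 C for an assumed activation energy.
R_GAS = 8.314            # gas constant, J/(mol K)
EA = 80_000.0            # assumed activation energy, J/mol (illustrative only)
T1, T2 = 298.15, 388.15  # 25 C and 115 C, in kelvin

speedup = math.exp(EA / R_GAS * (1 / T1 - 1 / T2))
# roughly three orders of magnitude faster at the higher temperature,
# the kind of factor that turns days of ambient re-oxidation into minutes
```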

Scooped by Dr. Stefan Gruenwald!

Strange Object in our Solar System: The Dwarf Planet Haumea

Haumea is the third closest dwarf planet to the Sun and is located beyond the orbit of Neptune. It has about 1/3 the mass of Pluto and was discovered in 2004 by a team of astronomers from Caltech at the Palomar Observatory in California working on a project headed by Mike Brown. However, Haumea was co-discovered in 2005 by a team headed by J. L. Ortiz at the Sierra Nevada Observatory in Spain.

Haumea's claim to fame is its elongated shape, which makes it the least spherical of all the dwarf planets. Haumea's highly ellipsoid shape is believed to be the result of its rapid rotation. This rotational speed, along with its collisional origin, makes Haumea one of the densest dwarf planets discovered to date.

It was classified as a dwarf planet by the International Astronomical Union (IAU) on September 17th, 2008, and was named after Haumea, the Hawaiian goddess of childbirth. Haumea has two small satellites of its own, called Hi’iaka and Namaka. These two little moons were discovered by Mike Brown’s team in 2005 through observations using the W.M. Keck Observatory.

Haumea’s moons are thought to be the result of a collision with a large object billions of years ago, which caused pieces of Haumea to fragment and begin orbiting the dwarf planet. One day on Haumea lasts just 3.9 Earth hours, making it one of the fastest-rotating large objects in the solar system.

The dwarf planet is made from rock with a thick coating of ice. Haumea is the third brightest object in the Kuiper belt, after the dwarf planets Pluto and Makemake. On a clear night with a good quality telescope, it is possible to see Haumea in the night sky.
