Amazing Science

The Making of a Julia Set: 3-D Fractals Offer Clues to Complex Systems


To get a 3-D shape from an ordinary polynomial takes a little doing. The first step is to run the polynomial dynamically — that is, to iterate it by feeding each output back into the polynomial as the next input. One of two things will happen: either the values will grow without bound, or they’ll settle into a stable, bounded pattern. To keep track of which starting values lead to which of those two outcomes, mathematicians construct the Julia set of a polynomial. The Julia set is the boundary between starting values whose iterates escape to infinity and those whose iterates remain bounded. This boundary — which differs for every polynomial — can be plotted on the complex plane, where it assumes all manner of highly intricate, swirling, symmetric fractal designs.
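As a rough illustration of the iteration described above, here is a minimal Python sketch (not the researchers' code) that classifies starting values for the quadratic family z**2 + c; the particular constant c, the escape radius of 2 and the iteration cap are conventional illustrative choices, not values taken from the article.

    # Classify starting points for f(z) = z**2 + c: points whose iterates stay
    # bounded approximate the filled Julia set; its boundary is the Julia set.
    def stays_bounded(z0, c, max_iter=200, escape_radius=2.0):
        z = z0
        for _ in range(max_iter):
            z = z * z + c          # feed each output back in as the next input
            if abs(z) > escape_radius:
                return False       # orbit escapes toward infinity
        return True                # orbit stays bounded, so z0 is in the filled Julia set

    if __name__ == "__main__":
        c = complex(-0.4, 0.6)     # an arbitrary example polynomial z**2 + c
        samples = [complex(x / 10, y / 10) for x in range(-15, 16) for y in range(-15, 16)]
        inside = [z for z in samples if stays_bounded(z, c)]
        print(f"{len(inside)} of {len(samples)} sample starting points stay bounded")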

 

If you shade the region bounded by the Julia set, you get the filled Julia set. If you use scissors and cut out the filled Julia set, you get the first piece of the surface of the eventual 3-D shape. To get the second, DeMarco and Lindsey wrote an algorithm. That algorithm analyzes features of the original polynomial, like its degree (the highest number that appears as an exponent) and its coefficients, and outputs another fractal shape that DeMarco and Lindsey call the “planar cap.”

 

“The Julia set is the base, like the southern hemisphere, and the cap is like the top half,” DeMarco said. “If you glue them together you get a shape that’s polyhedral.” The algorithm was Thurston’s idea. When he suggested it to Lindsey in 2010, she wrote a rough version of the program. She and DeMarco improved on the algorithm in their work together and “proved it does what we think it does,” Lindsey said. That is, for every filled Julia set, the algorithm generates the correct complementary piece.

 

The filled Julia set and the planar cap are the raw material for constructing a 3-D shape, but by themselves they don’t give a sense of what the completed shape will look like. This creates a challenge. When presented with the six faces of a cube laid flat, you could intuitively see how to fold them to make the correct 3-D shape. But with a less familiar two-dimensional surface, you’d be hard-pressed to anticipate the shape of the resulting 3-D object.

 

“There’s no general mathematical theory that tells you what the shape will be if you start with different types of polygons,” Lindsey said. Mathematicians have precise ways of defining what makes a shape a shape. One is to know its curvature. Any 3-D object without holes has a total curvature of exactly 4π; it’s a fixed value, just as any circle contains exactly 360 degrees. The shape — or geometry — of a 3-D object is completely determined by the way that fixed amount of curvature is distributed, combined with information about distances between points. In a sphere, the curvature is distributed evenly over the entire surface; in a cube, it’s concentrated in equal amounts at the eight evenly spaced vertices.
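The fixed total of 4π quoted here is the content of the Gauss–Bonnet theorem for a closed surface without holes, and the cube illustrates how that curvature can be concentrated at points (a standard calculation, added for clarity rather than taken from the article):

    \int_S K \, dA = 2\pi \, \chi(S) = 2\pi \cdot 2 = 4\pi
    \text{cube: each vertex carries an angle defect of } 2\pi - 3\cdot\tfrac{\pi}{2} = \tfrac{\pi}{2}, \text{ and } 8 \times \tfrac{\pi}{2} = 4\pi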


Physicists extend 'quantum machine learning' to infinite dimensions


The researchers, Hoi-Kwan Lau et al., have published a paper on generalizing quantum machine learning to infinite dimensions in a recent issue of Physical Review Letters.

 

As the physicists explain, quantum machine learning is a young subfield of quantum information science that combines the speed of quantum computing with the ability to learn and adapt offered by machine learning.

 

One of the biggest advantages of having a quantum machine learning algorithm for continuous variables is that it can theoretically operate much faster than classical algorithms. Since many science and engineering models involve continuous variables, applying quantum machine learning to these problems could potentially have far-reaching applications.

 

"Our work demonstrates the ability to take advantage of photonics to perform machine learning tasks on a quantum computer that could far exceed the speed of any conventional computer," coauthor George Siopsis at the University of Tennessee told Phys.org. "Quantum machine learning also offers potential advantages such as lower energy requirements owing to the ability to store more information per qubit, and a very low cost per qubit compared to other technologies."

 

Most quantum machine learning algorithms developed so far work only with problems involving discrete variables. Applying quantum machine learning to continuous-variable problems requires a very different approach.

 

To do this, the physicists had to develop a new set of tools that work with continuous variables. This involves replacing the logic gates that are used for discrete-variable states with physical gates, which work for continuous-variable states. Building up from these basic building blocks, the scientists then developed the subroutines that power quantum machine learning tasks, which are represented by matrices and vectors.
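The flavor of those matrix-and-vector subroutines can be sketched with a purely classical toy, shown below. This is an illustration only: it is not the authors' continuous-variable algorithm and involves no quantum hardware, but it shows the kind of linear-algebra step (encoding a data vector as normalized amplitudes, then applying a unitary transformation) that the quantum subroutines are designed to carry out natively.

    import numpy as np

    def amplitude_encode(x):
        """Rescale a real data vector so it could serve as state amplitudes."""
        x = np.asarray(x, dtype=float)
        return x / np.linalg.norm(x)

    def random_unitary(n, seed=0):
        """A random n x n unitary, standing in for a sequence of quantum gates."""
        rng = np.random.default_rng(seed)
        q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
        return q

    if __name__ == "__main__":
        data = [3.0, 1.0, 4.0, 1.0]
        state = amplitude_encode(data)       # normalized "amplitudes"
        u = random_unitary(len(state))       # stand-in for a gate sequence
        print("encoded amplitudes:", np.round(state, 3))
        print("after unitary:     ", np.round(u @ state, 3))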

 

Although the results of the study are purely theoretical, the physicists expect that the new algorithm for continuous variables could be experimentally implemented using currently available technology. The implementation could be done in several ways, such as by using optical systems, spin systems, or trapped atoms. Regardless of the type of system, the implementation would be challenging. For example, an optical implementation that the scientists outlined here would require some of the latest technologies, such as "cat states" (a superposition of the "0" and "1" states) and high rates of squeezing (to reduce quantum noise).

 

In the future, the scientists hope to further investigate how continuous-variable quantum machine learning can be extended to replicate some of the latest results involving discrete variables. Another interesting avenue to pursue is a hybrid approach, which would combine the methods of both discrete and continuous variables in a single algorithm.


Making math more Lego-like by using a 3D-pictographical language


A trio of Harvard researchers has developed a new 3-D pictorial language for mathematics with far-reaching potential as a tool across a wide spectrum, from pure math to physics.

 

The trio presented a 3D topological picture-language for quantum information, called quon. Their approach combines charged excitations carried by strings with topological properties that arise from embedding the strings in the interior of a 3D manifold with boundary. A quon is a composite that acts as a particle. Specifically, a quon is a hemisphere containing a neutral pair of open strings with opposite charge. The mathematicians interpreted multiquons and their transformations in a natural way. They obtained a type of relation, a string–genus “joint relation,” involving both a string and the 3D manifold. They used the joint relation to obtain a topological interpretation of the C∗-Hopf algebra relations, which are currently widely used in tensor networks. The team obtained a 3D representation of the controlled NOT (CNOT) gate that is considerably simpler than earlier work, and a 3D topological protocol for teleportation.

 

In the past, topological quantum information was formulated by Kitaev (1) and Freedman et al. (2).


Electrons Use DNA Like a Wire for Signaling DNA Replication 


A Caltech-led study has shown that the electrical wire-like behavior of DNA is involved in the molecule's replication.

 

In the early 1990s, Jacqueline Barton, the John G. Kirkwood and Arthur A. Noyes Professor of Chemistry at Caltech, discovered an unexpected property of DNA—that it can act like an electrical wire to transfer electrons quickly across long distances. Later, she and her colleagues showed that cells take advantage of this trait to help locate and repair potentially harmful mutations to DNA.

 

Now, Barton's lab has shown that this wire-like property of DNA is also involved in a different critical cellular function: replicating DNA. When cells divide and replicate themselves in our bodies—for example in the brain, heart, bone marrow, and fingernails—the double-stranded helix of DNA is copied. DNA also copies itself in reproductive cells that are passed on to progeny.

 

The new Caltech-led study, based on work by graduate student Elizabeth O'Brien in collaboration with Walter Chazin's group at Vanderbilt University, shows that a key protein required for replicating DNA depends on electrons traveling through DNA.

 

"Nature is the best chemist and knows exactly how to take advantage of DNA electron-transport chemistry," says Barton, who is also the Norman Davidson Leadership Chair of Caltech's Division of Chemistry and Chemical Engineering. "The electron transfer process in DNA occurs very quickly," says O'Brien, lead author of the study, appearing in the February 24 issue of Science. "It makes sense that the cell would utilize this quick-acting pathway to regulate DNA replication, which necessarily is a very rapid process."

 

The researchers found their first clue that DNA replication might involve the transport of electrons through the double helix by taking a closer look at the proteins involved. Two of the main players in DNA replication, critical at the start of the process, are the proteins DNA primase and DNA polymerase alpha. DNA primase typically binds to single-stranded, uncoiled DNA to begin the replication process. It creates a "primer" made of RNA to help DNA polymerase alpha start its job of copying the single strand of DNA to create a new segment of double-helical DNA.

 

DNA primase and DNA polymerase alpha molecules both contain iron-sulfur clusters. Barton and her colleagues previously discovered that these metal clusters are crucial for DNA electron transport in DNA repair. In DNA repair, specific proteins send electrons down the double helix to other DNA-bound repair proteins as a way to "test the line," so to speak, and make sure there are no mutations in the DNA. If there are mutations, the line is essentially broken, alerting the cell that mutations are in need of repair. The iron-sulfur clusters in the DNA repair proteins are responsible for donating and accepting traveling electrons.



Physicists detect friction-like force in vacuum


When three physicists first discovered through their calculations that a decaying atom moving through the vacuum experiences a friction-like force, they were highly suspicious. The results seemed to go against the laws of physics: The vacuum, by definition, is completely empty space and does not exert friction on objects within it. Further, if true, the results would contradict the principle of relativity, since they would imply that observers in two different reference frames would see the atom moving at different speeds (most observers would see the atom slow down due to friction, but an observer moving with the atom would not).

 

Writing in Physical Review Letters, physicists Matthias Sonnleitner, Nils Trautmann, and Stephen M. Barnett at the University of Glasgow knew something must be wrong, but at first they weren't sure what. "We spent ages searching for the mistake in the calculation and spent even more time exploring other strange effects until we found this (rather simple) solution," Sonnleitner explained.

 

The physicists eventually realized that the missing puzzle piece was a tiny bit of extra mass called the "mass defect"—an amount so tiny that it has never been measured in this context. This is the mass in Einstein's famous equation E = mc², which describes the amount of energy required to break up the nucleus of an atom into its protons and neutrons. This energy, called the "internal binding energy," is regularly accounted for in nuclear physics, which deals with larger binding energies, but is typically considered negligible in atom optics (the field of the present study) because of the much lower energies involved.

 

This subtle but important detail allowed the researchers to paint a very different picture of what was going on. As a decaying atom moves through the vacuum, it really does experience some kind of force resembling friction. But a true friction force would cause the atom to slow down, and this is not what's happening.

 

What's really happening is that, since the moving atom loses a tiny bit of mass as it decays, it loses momentum, not velocity. To explain in more detail: Although the vacuum is empty and does not exert any forces on the atom, it still interacts with the atom, and this interaction causes the excited atom to decay. As the moving atom decays to a lower energy state, it emits photons, causing it to lose a little bit of energy corresponding to a certain amount of mass. Since momentum is the product of mass and velocity, the decrease in mass causes the atom to lose a little bit of momentum, just as expected according to the conservation of energy and momentum in special relativity. So while the atom's mass (energy) and momentum decrease, its velocity remains constant.


How Life (and Death) Spring From Disorder


Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.

 

Living organisms seem rather like Maxwell’s demon. Whereas a beaker full of reacting chemicals will eventually expend its energy and fall into boring stasis and equilibrium, living systems have collectively been avoiding the lifeless equilibrium state since the origin of life about three and a half billion years ago. They harvest energy from their surroundings to sustain this nonequilibrium state, and they do it with “intention.” Even simple bacteria move with “purpose” toward sources of heat and nutrition. In his 1944 book What Is Life?, the physicist Erwin Schrödinger expressed this by saying that living organisms feed on “negative entropy.”

 

They achieve it, Schrödinger said, by capturing and storing information. Some of that information is encoded in their genes and passed on from one generation to the next: a set of instructions for reaping negative entropy. Schrödinger didn’t know where the information is kept or how it is encoded, but his intuition that it is written into what he called an “aperiodic crystal” inspired Francis Crick, himself trained as a physicist, and James Watson when in 1953 they figured out how genetic information can be encoded in the molecular structure of DNA.

 

A genome, then, is at least in part a record of the useful knowledge that has enabled an organism’s ancestors — right back to the distant past — to survive on our planet. According to David Wolpert, a mathematician and physicist at the Santa Fe Institute who convened the recent workshop, and his colleague Artemy Kolchinsky, the key point is that well-adapted organisms are correlated with their environment. If a bacterium swims dependably toward the left or the right when there is a food source in that direction, it is better adapted, and will flourish more, than one that swims in random directions and so only finds the food by chance. A correlation between the state of the organism and that of its environment implies that they share information in common. Wolpert and Kolchinsky say that it’s this information that helps the organism stay out of equilibrium — because, like Maxwell’s demon, it can then tailor its behavior to extract work from fluctuations in its surroundings. If it did not acquire this information, the organism would gradually revert to equilibrium: It would die.

 

Looked at this way, life can be considered as a computation that aims to optimize the storage and use of meaningful information. And life turns out to be extremely good at it. Landauer’s resolution of the conundrum of Maxwell’s demon set an absolute lower limit on the amount of energy a finite-memory computation requires: namely, the energetic cost of forgetting. The best computers today are far, far more wasteful of energy than that, typically consuming and dissipating more than a million times more. But according to Wolpert, “a very conservative estimate of the thermodynamic efficiency of the total computation done by a cell is that it is only 10 or so times more than the Landauer limit.”
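For reference, the Landauer limit mentioned here is the minimum energy needed to erase one bit of information at temperature T; evaluated at room temperature (a back-of-the-envelope figure, not one from the article) it is

    E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693) \approx 2.9\times10^{-21}\,\mathrm{J\ per\ bit.}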

The implication, he said, is that “natural selection has been hugely concerned with minimizing the thermodynamic cost of computation. It will do all it can to reduce the total amount of computation a cell must perform.” In other words, biology (possibly excepting ourselves) seems to take great care not to overthink the problem of survival. This issue of the costs and benefits of computing one’s way through life, he said, has been largely overlooked in biology so far.


Physicists Make the Case That Our Brains' Learning Is Controlled by Entropy


The way our brains learn new information has puzzled scientists for decades - we come across so much new information daily, how do our brains store what's important, and forget the rest more efficiently than any computer we've built?

 

It turns out that this could be controlled by the same laws that govern the formation of the stars and the evolution of the Universe, because a team of physicists has shown that, at the neuronal level, the learning process could ultimately be limited by the laws of thermodynamics.

 

"The greatest significance of our work is that we bring the second law of thermodynamics to the analysis of neural networks," lead researcher Sebastian Goldt from the University of Stuttgart in Germany told Lisa Zyga from Phys.org. 

 

The second law of thermodynamics is one of the most famous physics laws we have, and it states that the total entropy of an isolated system always increases over time.

 

Entropy is a thermodynamic quantity that's often referred to as a measure of disorder in a system. What that means is that, without extra energy being put into a system, transformations can't be reversed - things are going to get progressively more disordered, simply because disordered arrangements vastly outnumber ordered ones.
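That statistical reading of "disorder" can be made precise with Boltzmann's formula: entropy counts the number of microscopic arrangements Ω compatible with a macroscopic state, and the second law says that, for an isolated system, this count never goes down (textbook background, added for context):

    S = k_B \ln \Omega, \qquad \Delta S_{\mathrm{isolated}} \ge 0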

 

Entropy is currently the leading hypothesis for why the arrow of time only ever marches forwards. The second law of thermodynamics says that you can't un-crack an egg, because it would lower the Universe's entropy, and for that reason, there will always be a future and a past.

 

But what does this have to do with the way our brains learn? Just like the bonding of atoms and the arrangement of gas particles in stars, our brains find the most efficient way to organise themselves.

 

"The second law is a very powerful statement about which transformations are possible - and learning is just a transformation of a neural network at the expense of energy," Goldt explained to Zyga.



Cosmic test backs 'quantum spookiness'

Physicists harness starlight to support the case for entanglement.

 

A version of an iconic experiment to confirm quantum theory has for the first time used the light of distant stars to bolster the case for a phenomenon that Albert Einstein referred to as “spooky action at a distance”.

 

Einstein disliked the notion that objects can share a mysterious connection across any distance of space, and scientists have spent the past 50 years trying to make sure that their results showing this quantum effect could not have been caused by more intuitive explanations.

 

Quantum physics suggests that two so-called entangled particles can maintain a special connection — even at a large distance — such that if one is measured, that instantly tells an experimenter what measuring the other particle will show. This happens despite the fact neither particle has definite properties until it is measured. That unsettled some physicists, including Einstein, who favored an alternative explanation: that quantum theory is incomplete, and that the outcomes instead depend on some predetermined, but hidden, variables.

 

The latest effort to explore the phenomenon, to be published in Physical Review Letters on 7 February, uses light emitted by stars around 600 years ago to select which measurements to make in a quantum experiment known as a Bell test. In doing so, they narrow down the point in history when, if they exist, hidden variables could have influenced the experiment.
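For context, the most common form of Bell test is the CHSH version (the standard textbook statement, not a formula from the article): correlations E are measured between settings a, a′ on one particle and b, b′ on the other, and the combination

    S = E(a,b) - E(a,b') + E(a',b) + E(a',b')

obeys |S| ≤ 2 for any local hidden-variable theory, while quantum mechanics allows values up to 2√2. Choosing the settings with starlight pushes back the moment at which any hidden influence on that choice could have originated.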

 

“It’s a beautiful experiment,” says Krister Shalm, a quantum physicist at the US National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. Although few expected it to disprove quantum mechanics, such experiments “keep pushing alternative theories to be more and more contrived and ridiculous”, he says. Similar techniques could, in the future, help to protect against hackers who try to crack quantum-cryptography systems, he adds.


Quantum phase transition observed for the first time


A group of scientists led by Johannes Fink from the Institute of Science and Technology Austria (IST Austria) reported the first experimental observation of a first-order phase transition in a dissipative quantum system. Phase transitions include such phenomena as the freezing of water at the critical temperature of 0 degrees Celsius. However, phase transitions also occur at the quantum mechanical level, where they are still relatively unexplored by researchers.

 

One example of a phase transition at the quantum level is the photon-blockade breakdown, which was only discovered two years ago. During photon blockade, a photon fills a cavity in an optical system and prevents other photons from entering the same cavity until it leaves, hence blocking the flow of photons. But if the photon flux increases to a critical level, a quantum phase transition is predicted: The photon blockade breaks down, and the state of the system changes from opaque to transparent. This specific phase transition has now been experimentally observed by researchers who, for the first time, met the very specific conditions necessary to study this effect.

 

During a phase transition, the continuous tuning of an external parameter, for example temperature, leads to a transition between two robust steady states with different attributes. First-order phase transitions are characterized by a coexistence of the two stable phases when the control parameter is within a certain range close to the critical value. The two phases form a mixed phase in which some parts have completed the transition and others have not, as in a glass containing ice water. The experimental results that Fink and his collaborators will publish in the journal Physical Review X give insight into the quantum mechanical basis of this effect in a microscopic, zero-dimensional system.

 

Their setup consisted of a microchip with a superconducting microwave resonator acting as the cavity and a few superconducting qubits acting as the atoms. The chip was cooled to a temperature astoundingly close to absolute zero—0.01 Kelvin—so that thermal fluctuations did not play a role. To produce a flux of photons, the researchers then sent a continuous microwave tone to the input of the resonator on the chip. On the output side, they amplified and measured the transmitted microwave flux. For certain input powers, they detected a signal flipping stochastically between zero transmission and full transmission, proving the expected coexistence of both phases had occurred. "We have observed this random switching between opaque and transparent for the first time and in agreement with theoretical predictions," says lead author Johannes Fink from IST Austria.


First observational evidence that our universe could be a vast and complex hologram

A UK, Canadian and Italian study has provided what researchers believe is the first observational evidence that our universe could be a vast and complex hologram.

 

Theoretical physicists and astrophysicists, investigating irregularities in the cosmic microwave background (the 'afterglow' of the Big Bang), have found there is substantial evidence supporting a holographic explanation of the universe—in fact, as much as there is for the traditional explanation of these irregularities using the theory of cosmic inflation.

 

The researchers, from the University of Southampton (UK), University of Waterloo (Canada), Perimeter Institute (Canada), INFN, Lecce (Italy) and the University of Salento (Italy), have published findings in the journal Physical Review Letters.

A holographic universe, an idea first suggested in the 1990s, is one where all the information that makes up our 3-D 'reality' (plus time) is contained in a 2-D surface on its boundaries.

 

Prof Kostas Skenderis of Mathematical Sciences at the University of Southampton explains: "Imagine that everything you see, feel and hear in three dimensions (and your perception of time) in fact emanates from a flat two-dimensional field. The idea is similar to that of ordinary holograms where a three-dimensional image is encoded in a two-dimensional surface, such as in the hologram on a credit card. However, this time, the entire universe is encoded."

 

Although a 3-D film is not itself an example of holography, the idea can be thought of as rather like watching one in a cinema. We see the pictures as having height, width and, crucially, depth—when in fact it all originates from a flat 2-D screen. The difference, in our 3-D universe, is that we can touch objects and the 'projection' is 'real' from our perspective.

 

In recent decades, advances in telescopes and sensing equipment have allowed scientists to detect a vast amount of data hidden in the 'white noise' or microwaves (partly responsible for the random black and white dots you see on an un-tuned TV) left over from the moment the universe was created. Using this information, the team were able to make complex comparisons between networks of features in the data and quantum field theory. They found that some of the simplest quantum field theories could explain nearly all cosmological observations of the early universe.

 

Prof Skenderis comments: "Holography is a huge leap forward in the way we think about the structure and creation of the universe. Einstein's theory of general relativity explains almost everything large scale in the universe very well, but starts to unravel when examining its origins and mechanisms at quantum level. Scientists have been working for decades to combine Einstein's theory of gravity and quantum theory. Some believe the concept of a holographic universe has the potential to reconcile the two. I hope our research takes us another step towards this."

 

The scientists now hope their study will open the door to further our understanding of the early universe and explain how space and time emerged.


Light-speed camera snaps light’s “sonic boom” for the first time


A light-speed event requires an even faster camera. A new camera setup has captured the first film of a photonic Mach cone – basically, a sonic boom with light – in real time.

“Our camera is different from a common camera where you just take a snapshot and record one image: our camera works by first capturing all the images of a dynamic event into one snapshot. And then we reconstruct them, one by one,” says Jinyang Liang at Washington University in St Louis.

The technique, called “lossless-encoding compressed ultrafast photography” (LLE-CUP), captures 100 billion frames per second, allowing it to create real-time video of scattering light with a single snapshot.

 

Einstein’s theory of relativity forbids anything from travelling faster than the speed of light. So Liang and his colleagues used a trick to mimic a beam of light breaking its own speed limit. They shot a laser through a tunnel filled with dry ice fog, which was flanked by two silicone rubber panels. Because light travels through silicone more slowly than through fog, the laser pulse left a shock wave trailing behind it in a cone shape.
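The geometry is the same as for an ordinary sonic boom: when the effective source speed v_source exceeds the wave speed v_wave in the surrounding material, the trailing wavefronts pile up into a cone whose half-angle obeys the standard Mach relation (general background; the article does not quote the angle for this particular setup):

    \sin\theta = \frac{v_{\mathrm{wave}}}{v_{\mathrm{source}}}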

 

Ultrafast imaging is already used in medicine and the study of light, but it usually requires multiple snapshots, meaning that the event being recorded needs to be precisely repeatable. That’s not always possible in the real world. By capturing the whole thing in one go, the LLE-CUP system eliminates that problem, and also lets researchers analyze any extra scattering of light that would distort their image.

 

“The whole thing about biomedical imaging is that tissue scatters light – that’s why we’re not transparent – so that degrades information content,” says Bruce Tromberg, a professor of surgery and biomedical engineering at the University of California, Irvine. With LLE-CUP, the scattering from irrelevant tissue can be separated out, isolating the light’s interactions with the specific cells of interest.

 

The system could be used with standard cameras, microscopes and even telescopes, Liang says. It could help detect the very small, like neurons firing or cancer cells, and the extremely large, like changes in the light within a supernova. “It’s got a very high ‘wow, this is amazing’ factor,” Tromberg says.


Metallic hydrogen, once theory, becomes reality

Nearly a century after it was theorized, Harvard scientists have succeeded in creating the rarest - and potentially one of the most valuable - materials on the planet.

 

The material - atomic metallic hydrogen - was created by Thomas D. Cabot Professor of the Natural Sciences Isaac Silvera and post-doctoral fellow Ranga Dias. In addition to helping scientists answer fundamental questions about the nature of matter, the material is theorized to have a wide range of applications, including as a room-temperature superconductor. The creation of the rare material is described in a January 26, 2017, paper published in Science.

 

"This is the holy grail of high-pressure physics," Silvera said. "It's the first-ever sample of metallic hydrogen on Earth, so when you're looking at it, you're looking at something that's never existed before."

 

To create it, Silvera and Dias squeezed a tiny hydrogen sample to 495 gigapascals, or more than 71.7 million pounds per square inch - greater than the pressure at the center of the Earth. At those extreme pressures, Silvera explained, solid molecular hydrogen - which consists of molecules on the lattice sites of the solid - breaks down, and the tightly bound molecules dissociate and transform into atomic hydrogen, which is a metal.
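The unit conversion behind the quoted figure is straightforward (1 psi ≈ 6.895 × 10³ Pa):

    495\,\mathrm{GPa} = 4.95\times10^{11}\,\mathrm{Pa} \approx \frac{4.95\times10^{11}}{6.895\times10^{3}}\,\mathrm{psi} \approx 7.2\times10^{7}\,\mathrm{psi}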

 

While the work offers an important new window into understanding the general properties of hydrogen, it also offers tantalizing hints at potentially revolutionary new materials.


Exotic black holes caught turning into a superfluid


The black holes in our universe may seem like bizarre, voracious beasts – but stranger ones are possible. Simulations of black holes have revealed the first superfluid specimen.

 

Superfluids are a form of matter that take mere melting one step further. When a solid turns to a liquid, what was once sturdy and inflexible begins to flow. Superfluids have zero stickiness or viscosity: they can even flow uphill. They also have completely uniform temperature.

 

But superfluids are extremely difficult to create. Only liquid helium has been coaxed into going superfluid, and then only at temperatures close to absolute zero. The stuff is even harder to study or model: many of the important calculations are ones that nobody knows how to do yet.

 

Now, Robert Mann at the University of Waterloo in Canada and his colleagues have modelled a theoretical black hole that changes in a way that’s mathematically identical to what liquid helium does when it turns superfluid.


OLYMPUS experiment sheds light on structure of protons

A mystery concerning the structure of protons is a step closer to being solved, thanks to a seven-year experiment led by researchers at MIT. The findings suggest that two photons, not one, are exchanged in electron-proton interactions.

 

For many years researchers have probed the structure of protons — subatomic particles with a positive charge — by bombarding them with electrons and examining the intensity of the scattered electrons at different angles. In this way they have attempted to determine how the proton’s electric charge and magnetization are distributed. These experiments had previously led researchers to assume that the electric and magnetic charge distributions are the same, and that one photon — an elementary particle of light — is exchanged when the protons interact with the bombarding electrons.

 

However, in the early 2000s, researchers began to carry out experiments using polarized electron beams, which measure electron-proton elastic scattering using the spin of the protons and electrons. These experiments revealed that the ratio of electric to magnetic charge distributions decreased dramatically with higher-energy interactions between the electrons and protons. This led to the theory that not one but two photons were sometimes being exchanged during the interaction, causing the uneven charge distribution. What’s more, the theory predicted that both of these particles would be so-called “hard,” or high-energy photons.
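The quantity at stake in these measurements is the ratio of the proton's electric and magnetic form factors (standard background, not a formula from the article): unpolarized Rosenbluth-type experiments found the ratio

    R(Q^2) = \mu_p \, \frac{G_E(Q^2)}{G_M(Q^2)}

to be roughly constant, while the polarization-transfer experiments described above found it falling steadily as the momentum transfer Q² grows, which is the discrepancy that hard two-photon exchange was proposed to resolve.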

 

In a bid to identify this “two-photon exchange,” an international team led by researchers in the Laboratory for Nuclear Science at MIT carried out a seven-year experiment, known as OLYMPUS, at the German Electron Synchrotron (DESY) in Hamburg.

 

In a paper published this week in the journal Physical Review Letters, the researchers reveal the results of this experiment, which indicate that two photons are indeed exchanged during electron-proton interactions.

 

However, unlike the theoretical predictions, analysis of the OLYMPUS measurements suggests that, most of the time, only one of the photons has high energy, while the other must carry very little energy indeed, according to Richard Milner, a professor of physics and member of the Laboratory for Nuclear Science’s Hadronic Physics Group, who led the experiment. “We saw little if no evidence for a hard two-photon exchange,” Milner says.

 

Having proposed the idea for the experiment in the late 2000s, the group was awarded funding in 2010. The researchers had to disassemble the former BLAST spectrometer — a complex 125-cubic-meter-sized detector based at MIT — and transport it to Germany, where it was reassembled with some improvements. They then carried out the experiment over three months in 2012, before the particle accelerator at the laboratory was itself decommissioned and shut down at the end of that year.

 

The experiment, which was carried out at the same time as two others in the U.S. and Russia, involved bombarding the protons with both negatively charged electrons and positively charged positrons, and comparing the difference between the two interactions, according to Douglas Hasell, a principal research scientist in the Laboratory for Nuclear Science and the Hadronic Physics Group at MIT, and another of the paper’s authors.

 

The process will produce a subtly different measurement depending on whether the protons are scattered by electrons or positrons, Hasell says. “If you see a difference (in the measurements), it would indicate that there is a two-photon effect that is significant.” The collisions were run for three months, and the resulting data took a further three years to analyze, Hasell says. The difference between the theoretical and experimental results means further experiments may need to be carried out in the future, at even higher energies where the two-photon exchange effect is expected to be larger, Hasell says.



Yale-led team puts dark matter on the map


A Yale-led team has produced one of the highest-resolution maps of dark matter ever created, offering a detailed case for the existence of cold dark matter -- sluggish particles that comprise the bulk of matter in the universe.

 

The dark matter map is derived from Hubble Space Telescope Frontier Fields data of a trio of galaxy clusters that act as cosmic magnifying glasses to peer into older, more distant parts of the universe, a phenomenon known as gravitational lensing.

 

Yale astrophysicist Priyamvada Natarajan led an international team of researchers that analyzed the Hubble images. "With the data of these three lensing clusters we have successfully mapped the granularity of dark matter within the clusters in exquisite detail," Natarajan said. "We have mapped all of the clumps of dark matter that the data permit us to detect, and have produced the most detailed topological map of the dark matter landscape to date."

 

Scientists believe dark matter -- theorized, unseen particles that neither reflect nor absorb light, but are able to exert gravity -- may comprise 80% of the matter in the universe. Dark matter may explain the very nature of how galaxies form and how the universe is structured. Experiments at Yale and elsewhere are attempting to identify the dark matter particle; the leading candidates include axions and neutralinos.

 

"While we now have a precise cosmic inventory for the amount of dark matter and how it is distributed in the universe, the particle itself remains elusive," Natarajan said.

 

Dark matter particles are thought to provide the unseen mass that is responsible for gravitational lensing, by bending light from distant galaxies. This light bending produces systematic distortions in the shapes of galaxies viewed through the lens. Natarajan's group decoded the distortions to create the new dark matter map.


Physicists Uncover Geometric ‘Theory Space’

A decades-old method called the “bootstrap” is enabling new discoveries about the geometry underlying all quantum theories.

 

In the 1960s, the charismatic physicist Geoffrey Chew, Member (1956) in the School of Mathematics/Natural Sciences, espoused a radical vision of the universe, and with it, a new way of doing physics, arguing that "Nature is as it is because this is the only possible nature consistent with itself.” He believed he could deduce nature’s laws solely from the demand that they be self-consistent. Particles, Chew said, “pull themselves up by their own bootstraps.”

 

Recently, the bootstrap method has been re-energized. As the new generation of bootstrappers, including Professor Nima Arkani-Hamed and Carl P. Feinberg Professor Juan Maldacena, current Member David Simmons-Duffin, Member (2010–13) Thomas Hartman, and Junior Visiting Professor (2015–16) and Member (2011–12) David Poland in the School of Natural Sciences, explore this abstract theory space, they seem to be verifying the vision that Chew, now 92 and long retired, laid out half a century ago—but they’re doing it in an unexpected way.

 

As physicists use the bootstrap to explore the geometry of this theory space, they are pinpointing the roots of “universality,” a remarkable phenomenon in which identical behaviors emerge in materials as different as magnets and water. They are also discovering general features of quantum gravity theories, with apparent implications for the quantum origin of gravity in our own universe and the origin of space-time itself.

The bootstrap is technically a method for computing “correlation functions” — formulas that encode the relationships between the particles described by a quantum field theory. Consider a chunk of iron. The correlation functions of this system express the likelihood that iron atoms will be magnetically oriented in the same direction, as a function of the distances between them. The two-point correlation function gives you the likelihood that any two atoms will be aligned, the three-point correlation function encodes correlations between any three atoms, and so on. These functions tell you essentially everything about the iron chunk. But they involve infinitely many terms riddled with unknown exponents and coefficients. They are, in general, onerous to compute. The bootstrap approach is to try to constrain what the terms of the functions can possibly be in hopes of solving for the unknown variables. Most of the time, this doesn’t get you far. But in special cases, as the theoretical physicist Alexander Polyakov began to figure out in 1970, the bootstrap takes you all the way.
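For the iron example, the two-point function has a concrete form: writing s(x) for the orientation of the atomic moment at position x, it is the average ⟨s(x) s(y)⟩ over thermal fluctuations, and exactly at a critical point it decays as a power of the separation r (the standard definition and conventional exponent notation, added for clarity rather than drawn from the article):

    G(x, y) = \langle s(x)\, s(y) \rangle, \qquad G(r) \sim \frac{1}{r^{\,d - 2 + \eta}} \quad \text{at the critical point,}

and it is constraints of exactly this kind that the bootstrap turns into equations for the unknown exponents.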


Chiral superconductivity experimentally demonstrated for the first time


Scientists have found that a superconducting current flows in only one direction through a chiral nanotube, marking the first observation of the effects of chirality on superconductivity. Until now, superconductivity has only been demonstrated in achiral materials, in which the current flows in both directions equally.

 

The team of researchers, F. Qin et al., from Japan, the US, and Israel, have published a paper on the first observation of chiral superconductivity in a recent issue of Nature Communications.

Chiral superconductivity combines two typically unrelated concepts in a single material: Chiral materials have mirror images that are not identical, similar to how left and right hands are not identical because they cannot be superimposed one on top of the other. And superconducting materials can conduct an electric current with zero resistance at very low temperatures.

 

Observing chiral superconductivity has been experimentally challenging due to the material requirements. Although carbon nanotubes are superconducting, chiral, and commonly available, so far researchers have only successfully demonstrated superconducting electron transport in nanotube assemblies and not in individual nanotubes, which are required for this purpose.

 

"The most important significance of our work is that superconductivity is realized in an individual nanotube for the first time," coauthor Toshiya Ideue at The University of Tokyo told Phys.org. "It enables us to search for exotic superconducting properties originating from the characteristic (tubular or chiral) structure."

 

The achievement is only possible with a new two-dimensional superconducting material called tungsten disulfide, a type of transition metal dichalcogenide - a new class of materials with potential applications in electronics, photonics, and other areas. The tungsten disulfide nanotubes can be made superconducting at low temperatures using a method called ionic liquid gating, and they also have a chiral structure. In addition, it's possible to run a superconducting current through an individual tungsten disulfide nanotube.

 

When the researchers ran a current through one of these nanotubes and cooled the device down to 5.8 K, the current became superconducting—in this case, meaning its normal resistance dropped by half. When the researchers applied a magnetic field parallel to the nanotube, they observed small antisymmetric signals that travel in one direction only. These signals are negligibly small in nonchiral superconducting materials, and the researchers explain that the chiral structure is responsible for strongly enhancing these signals.

 

"The asymmetric electric transport is realized only when a magnetic field is applied parallel to the tube axis," Ideue said. "If there is no magnetic field, current should flow symmetrically. We note that electric current should be asymmetric (if the magnetic field is applied parallel to the tube axis) even in the normal state (non-superconducting region), but we could not see any discernible signals in the normal state yet, interestingly, it shows a large enhancement in the superconducting region."


Increasing the sensitivity of next-generation gravitational wave detectors


Nearly one year ago today, the LIGO Collaboration announced the detection of gravitational waves, once again confirming Einstein's theory of General Relativity. This important discovery by the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) has spurred great interest in improving these advanced optical detectors. The mission of gravitational wave scientists worldwide is to make gravitational wave detection a routine occurrence. Scientists from the institute that developed the lasers used in Advanced LIGO have made significant progress to support that goal.

 

Advanced LIGO is a 2.5-mile-long optical device known as an interferometer that uses laser light to detect gravitational waves coming from distant cosmic events such as colliding black holes or collapsing stars. Improving the stability of the laser source and decreasing noise that can hide weak signals coming from gravitational waves could help improve the sensitivity of gravitational wave detectors.

 

"We have made significant progress towards stable laser sources for third-generation gravitational wave detectors and prototypes of those," said Benno Willke of the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) and Leibniz Universität Hannover, leader of the research team. "More stable lasers enable interferometers to sense gravitational waves that are weaker and from sources further away and thus reveal important insights into astrophysical events involving black holes and neutron stars."


Scientists calculate signal of gravitational wave sources


Scientists have calculated the ancient signals that emerged just after the Big Bang. These signals come from long-lost cosmological phenomena known as ‘oscillons,’ gravitational wave sources from just fractions of a second after the birth of the universe. While oscillons have since disappeared, the gravitational waves they gave off have not, and the researchers say these can be used to look further into the history of the universe.


Many ways to spin a photon: Half-quantization of a total optical angular momentum


The angular momentum of light plays an important role in many areas, from optical trapping to quantum information. In the usual three-dimensional setting, the angular momentum quantum numbers of the photon are integers, in units of the reduced Planck constant ħ. A group of scientists now shows that, in reduced dimensions, photons can have a half-integer total angular momentum. They identify a new form of total angular momentum, carried by beams of light, comprising an unequal mixture of spin and orbital contributions. The scientists demonstrate the half-integer quantization of this total angular momentum using noise measurements. They conclude that for light, as is known for electrons, reduced dimensionality allows new forms of quantization.


Supercomputing, experiment combine for first look at magnetism of real nanoparticle


Barely wider than a strand of human DNA, magnetic nanoparticles -- such as those made from iron and platinum atoms -- are promising materials for next-generation recording and storage devices like hard drives. Building these devices from nanoparticles should increase storage capacity and density, but understanding how magnetism works at the level of individual atoms is critical to getting the best performance.

 

However, magnetism at the atomic scale is extremely difficult to observe experimentally, even with the best microscopes and imaging technologies. That's why researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy's (DOE's) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE's Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.

 

"These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles," said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.

 

Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL's Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab's Molecular Foundry to combine world-class experimental data with world-class computing to do something new--simulate magnetism atom by atom in a real nanoparticle.

 

Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.

 

Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.



Quantum RAM: Modelling the big questions with the very small


Griffith's Professor Geoff Pryde, who led the project, says that such processes could be simulated using a "quantum hard drive", much smaller than the memory required for conventional simulations.

 

"Stephen Hawking once stated that the 21st century is the 'century of complexity', as many of today's most pressing problems, such as understanding climate change or designing transportation system, involve huge networks of interacting components," he says.

 

"Their simulation is thus immensely challenging, requiring storage of unprecedented amounts of data. What our experiments demonstrate is a solution may come from quantum theory, by encoding this data into a quantum system, such as the quantum states of light."

 

Einstein once said that "God does not play dice with the universe," voicing his disdain with the idea that quantum particles contain intrinsic randomness. "But theoretical studies showed that this intrinsic randomness is just the right ingredient needed to reduce the memory cost for modelling partially random statistics," says Dr Mile Gu, a member of the team who developed the initial theory.

 

In contrast with the usual binary storage system - the zeroes and ones of bits - quantum bits can be simultaneously 0 and 1, a phenomenon known as quantum superposition.
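In standard notation, a single qubit is a weighted superposition of the two classical values, and a register of n qubits can carry amplitudes for all 2^n classical configurations at once, which is the property the simulator exploits to hold many states of the modelled system simultaneously (textbook background, not a formula from the paper):

    |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1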

 

The researchers, in their paper published in Science Advances, say this freedom allows quantum computers to store many different states of the system being simulated in different superpositions, using less memory overall than in a classical computer. The team constructed a proof-of-principle quantum simulator using a photon - a single particle of light - interacting with another photon. They measured the memory requirements of this simulator, and compared it with the fundamental memory requirements of a classical simulator, when used to simulate specified partly random processes.

 

The data showed that the quantum system could complete the task with much less information stored than the classical computer - a factor-of-20 improvement at the best point. "Although the system was very small - even the ordinary simulation required only a single bit of memory - it proved that quantum advantages can be achieved," Pryde says.


Molecular fountain may lead to more precise measurement of physical constants


A team of researchers at Vrije Universiteit Amsterdam has built, for the first time, a molecular fountain. The group has published a paper in the journal Physical Review Letters describing how they created the fountain, how it works and their ideas on how it might be used to more precisely measure physical constants.

 

Scientists developed atomic fountains back in the 1980s and since that time they have been applied to a myriad of applications, the most well-known example likely being the atomic clock. The purpose of an atomic fountain is to allow for measuring the characteristics of atoms moving at relatively slow speeds. The slowed speeds are due to the way the fountain works—atoms are cooled to a very low temperature and are then shot upwards where they eventually slow, stop and begin to fall due to the force of gravity. An atomic clock works by setting an atom's internal state before it is shot upwards and then noting the minute change to its internal state as it comes back down.

 

Scientists would like to have access to a similar fountain that works at the molecular level, because they believe it could be used to more accurately measure physical constants, which in turn could help in stringent testing of the Standard Model. Unfortunately, until now, that was not possible because of the difficulty in cooling molecules without causing them to spread out. In this new effort, the researchers have overcome that problem.

 

To create the molecular fountain, the researchers cooled ammonia molecules by combining two prior techniques and applying them to a molecular beam. The first involved applying voltages in a rapidly switching manner to remove energy from the beam. The second involved applying high voltage that was smoothly varied to allow for continually slowing the potential of the beam as well as its speed. Once the molecules were slowed in a trap, they were fired upward in such a way as to cause them to undergo changes in velocity and position. They were then ionized by a laser and measured by a detector disk.

 

The device is not yet able to offer physical constant measurements, however, because it is only able to detect a single molecule for every five repetitions of the fountain blast, which works out to less than one detection per second. This means that it will take a lot of time to gather enough information from a single fountain to make any real measurements. Fortunately, more repetitions will produce additional data, which suggests that highly precise measurements could come in the near future.


Scientists unveil a new form of matter: Time crystals


To most people, crystals mean diamond bling, semiprecious gems or perhaps the jagged amethyst or quartz crystals beloved by collectors. To Norman Yao, these inert crystals are the tip of the iceberg. If crystals have an atomic structure that repeats in space, like the carbon lattice of a diamond, why can't crystals also have a structure that repeats in time? That is, a time crystal?

 

In a paper published online last week in the journal Physical Review Letters, the University of California, Berkeley assistant professor of physics describes exactly how to make and measure the properties of such a crystal, and even predicts what the various phases surrounding the time crystal should be -- akin to the liquid and gas phases of ice.

 

This is not mere speculation. Two groups followed Yao's blueprint and have already created the first-ever time crystals. The groups at the University of Maryland and Harvard University reported their successes, using two totally different setups, in papers posted online last year, and have submitted the results for publication. Yao is a co-author on both papers.

 

Time crystals repeat in time because they are kicked periodically, sort of like tapping Jell-O repeatedly to get it to jiggle, Yao said. The big breakthrough, he argues, is less that these particular crystals repeat in time than that they are the first of a large class of new materials that are intrinsically out of equilibrium, unable to settle down to the motionless equilibrium of, for example, a diamond or ruby.

 

"This is a new phase of matter, period, but it is also really cool because it is one of the first examples of non-equilibrium matter," Yao said. "For the last half-century, we have been exploring equilibrium matter, like metals and insulators. We are just now starting to explore a whole new landscape of non-equilibrium matter."


Physicists detect exotic looped trajectories of light in three-slit experiment


Physicists have performed a variation of the famous 200-year-old double-slit experiment that, for the first time, involves "exotic looped trajectories" of photons. These photons travel forward through one slit, then loop around and travel back through another slit, and then sometimes loop around again and travel forward through a third slit.

 

Interestingly, the contribution of these looped trajectories to the overall interference pattern leads to an apparent deviation from the usual form of the superposition principle. This apparent deviation can be understood as an incorrect application of the superposition principle—once the additional interference between looped and straight trajectories is accounted for, the superposition can be correctly applied.
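The "usual form of the superposition principle" is often tested through the Sorkin combination of the single-, double- and triple-slit patterns, which is exactly zero if the total probability is built only from pairwise interference; a nonzero value signals extra contributions such as the looped trajectories (the standard formulation, quoted here as background rather than taken verbatim from the paper):

    \varepsilon = P_{ABC} - P_{AB} - P_{AC} - P_{BC} + P_A + P_B + P_C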

 

The team of physicists, led by Omar S. Magaña-Loaiza and Israel De Leon, has published a paper on the new experiment in a recent issue of Nature Communications.
