Amazing Science ("physics" tag)
Curated by Dr. Stefan Gruenwald

Lawrence Livermore scientist develops uncrackable code for nuclear weapons


Mark Hart, a scientist and engineer in Lawrence Livermore National Laboratory’s (LLNL) Defense Technologies Division, has been awarded the 2015 Surety Transformation Initiative (STI) Award from the National Nuclear Security Administration’s (NNSA) Enhanced Surety Program.


The STI award aims to stimulate and encourage the development of potentially transformational nuclear weapon surety technologies and to explore innovative, preferably paradigm-shifting, solutions to unmet surety needs.


“STI’s task is to reach beyond the traditional stockpile stewardship function of maintaining existing nuclear weapon capability in the absence of supercritical testing,” said Robert Sherman, enhanced surety federal program manager in NNSA’s Technology Maturation Division. “STI is intended not to maintain or polish ‘your grandfather’s Oldsmobile,’ but to go beyond it:  to invent devices and technologies that serve the 21st century nuclear security needs of the American people better than they are served by existing Cold War legacy technologies.”


Hart’s winning proposal is for Intrinsic Use Control (IUC), a concept that is capable of providing improved, quantifiable safety and use control within a nuclear weapon. Nuclear weapons exist; therefore, control is essential. Use control of a weapon is focused on providing unencumbered authorized use while restricting unauthorized use. Safety, use control and physical security work in concert for the weapon’s surety. IUC provides a less than 10⁻¹⁸ chance of controlling or operating an individual protected component, and a less than 10⁻⁷² chance of controlling or operating the entire protected system.
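Purely as an arithmetic aside, and not a statement about how IUC is actually built: the two quoted probabilities are related by a simple power, since 10⁻⁷² = (10⁻¹⁸)⁴. One reading consistent with that arithmetic, offered here only as an assumption, is a system made of several independently protected components whose individual probabilities multiply:

```latex
% Illustrative arithmetic only; the factor of four components is an
% assumption, not a design detail given in the article.
P_{\text{component}} < 10^{-18}, \qquad
P_{\text{system}} < 10^{-72} = \left(10^{-18}\right)^{4}
```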


First look at nuclear fuel in a meltdown scenario


Scientists have managed to take their first close-up look at what happens to nuclear fuel when it becomes molten, as it would in a nuclear reactor meltdown. In an innovative lab experiment, they discovered that uranium dioxide fuel behaves differently when molten than in its solid state.


The findings, reported in the journal Science, may help researchers improve safety at nuclear power plants, by better understanding uranium dioxide's behaviour under extreme temperatures.


"In extreme events like Fukushima and Chernobyl the uranium dioxide literally melts, and we wanted to study the material to really understand it," says the paper's lead author Dr Lawrie Skinner of Stony Brook University in New York. "We can now pin down a little bit more accurately what the properties and temperature of the melt will be. Any sensible reactor design should take into account the real structure, physical properties, and behavior of this melt."


Until now, the extreme heat and radiation have made it impossible for scientists to study uranium dioxide's characteristics and structure in a molten state. Uranium dioxide melts at over 3000°C, far too hot for most furnace container materials, which would melt and react with the test samples.


Skinner and colleagues got around the container problem by floating a tiny, 3-millimetre bead of uranium dioxide in a gas stream and heating it with a laser. They were able to study the relative positions of the atoms in both hot solid and molten uranium dioxide beads using high-energy synchrotron X-ray diffraction. "We didn't really know what to expect; it's not something we've measured before," says Skinner.


Physicists study magnetism with the roles of position and momentum reversed


Normally, the strength of a magnetic field increases as you get closer to a magnet and decreases as you move further away—a concept easily understood when placing magnets near a refrigerator, for instance. But recent research has shown that exotic "momentum-space artificial magnetic fields" can be created where the strength of the magnetic field depends on how fast a particle moves, instead of where the particle is. In other words, the roles of position and momentum are swapped.


Now in a new paper, physicists have explored these ideas further, especially at the quantum level. They show how current experiments can be modified to study the motion of a quantum particle in a momentum-space magnetic field. They explain that these systems will be able to experimentally realize a "wonderland of new physics," such as magnetism on a torus, for the first time.


The physicists, Hannah M. Price, Tomoki Ozawa, and Iacopo Carusotto at the INO-CNR BEC Center and the University of Trento, Italy, have published their paper discussing momentum-space magnetism in a recent issue of Physical Review Letters.


"Magnetism is fundamental in many areas of physics, and it leads to many fascinating phenomena," Price told Phys.org. "Physicists use mathematical equations to capture the behavior of a quantum particle in a magnetic field. These equations have a particular, beautiful mathematical structure. But we can also reverse this logic. If we engineer or find an equation with this particular mathematical structure, the behavior of a particle will be like that of a particle in an 'artificial magnetic field,' even if the 'field' has a completely different underlying physical origin. As has been known for a long time, this beautiful mathematical structure can be found or created in many different physical contexts. This is a really powerful tool that physicists use to engineer and learn more about magnetism."


"We can either view our problem in terms of all the possible 'position states' of a particle or equally in terms of all the possible 'momentum states,'" Price explained. "Depending on the viewpoint we choose, our mathematical equations will have a different form, and so we usually choose the viewpoint that gives us the easiest equations to solve and understand. The beautiful mathematical structure described above appears when we have a magnetic field and we look at our equations in terms of 'position states.' "However, what if we could find equations in terms of 'momentum states' that had an analogous mathematical structure? Then we could draw an analogy with magnetism: we would get the same quantum physics but where the 'position' must be swapped everywhere with the 'momentum.' "The equations could be understood as describing a particle in an 'artificial momentum-space magnetic field.'"


CERN: LHCb experiment observes two new baryon particles never seen before


Today the collaboration for the LHCb experiment at CERN’s Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b'- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.


Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called "spin". In the Xi_b'- state, the spins of the two lighter quarks point in opposite directions, whereas in the Xi_b*- state they are aligned. This difference makes the Xi_b*- a little heavier.


"Nature was kind and gave us two particles for the price of one," said Matthew Charles of the CNRS's LPNHE laboratory at Paris VI University. "The Xi_b'- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn't have seen it at all using the decay signature that we were looking for.”


Seaborgium Hexacarbonyl Sg(CO)6: First Carbonyl Complex of a Superheavy Element


Atoms with the same number of protons belong to the same element. Atomic nuclei with the same number of protons but different numbers of neutrons are called isotopes. The elements up to uranium (element 92) exist in nature (except for technetium); the elements heavier than uranium are man-made. All elements are arranged in the periodic table, where their positions correspond to their proton number. Elements in the same column (i.e., in the same group) feature a similar electronic shell structure, which characterizes an element's chemical behavior. An element's position in the periodic table thus provides information on its chemical behavior, e.g., whether it behaves as a metal or an inert gas.


If atomic nuclei have too many protons (all of which repel each other) or an unfavorable proton-to-neutron ratio, they are not stable but undergo radioactive decay. The elements up to fermium (atomic number 100) can be produced at research reactors by irradiating a target of a heavy element with neutrons. The target atoms capture a neutron and subsequently decay through β⁻ emission, thus forming the element with the next higher proton number. This process can be repeated, up to fermium.

As there are no isotopes of fermium that decay through β⁻ emission, no elements with a higher proton number can be synthesized by this method.
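A textbook illustration of the capture-then-decay chain described above (a standard example, not one taken from the article): neutron irradiation of uranium-238 breeds the next two elements, neptunium and plutonium.

```latex
% Neutron capture followed by two successive beta-minus decays;
% each \beta^- step converts a neutron into a proton, raising the
% atomic number by one while the mass number stays at 239.
^{238}\mathrm{U} + n \;\rightarrow\; ^{239}\mathrm{U}
\;\xrightarrow{\beta^{-}}\; ^{239}\mathrm{Np}
\;\xrightarrow{\beta^{-}}\; ^{239}\mathrm{Pu}
```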


The heavier an atom is, the more protons are contained in its nucleus. With increasing proton number, the repulsive force of these protons will eventually lead to immediate disintegration of the nucleus. The elements with a proton number higher than 103 can only exist due to nuclear shell effects and are called the superheavy elements. A topic of intense research concerns the question of the heaviest possible element. To date, all elements up to element 112, as well as elements 114 and 116, are officially recognized as discovered, and observations of elements 113, 115, 117, and 118 have also been reported. It is currently not clear which element is the heaviest one that can exist.


The production of ²⁶⁵Sg and its separation in GARIS were perfected in preparatory work led by Dr. Hiromitsu Haba of the RIKEN Nishina Center (RNC) and his team. In this nuclear reaction, a few Sg atoms per hour can be produced.


Seaborgium hexacarbonyl – Why is it so special?


Carbon monoxide (CO) is known to form complexes with many transition metals. In 1890, Ludwig Mond, Carl Langer and Friedrich Quincke reported the first synthesis of a carbonyl complex, nickel tetracarbonyl, Ni(CO)₄. In this compound, the nickel (Ni) atom is surrounded by four carbon monoxide (CO) molecules.


In this type of molecule, coordination bonds (rather than covalent bonds) form between the metal and the carbon monoxide.


The carbon monoxide ligands bind to the metal by forming a so-called σ-donation bond, and a π-backbond from the metal to the carbon monoxide ligand is established. In the σ-donation bond, the highest occupied molecular orbital (HOMO) of the CO donates electron density into the σ-symmetric orbitals of the metal (the s, p1/2 or dz² orbitals). In the π-backbonding, electron density from the π-symmetric d-orbitals of the metal is donated into the lowest unoccupied molecular orbital (LUMO) of the CO ligand. The σ-donation bond is the stronger of the two, while the π-backbond is slightly weaker.


Synthesizing carbonyl complexes with fusion products directly behind the target in a CO-containing atmosphere is not possible, as the primary beam would pass through the gas and create a plasma, destroying the CO molecules. Only the new approach of performing chemical experiments behind a separator such as TASCA or GARIS therefore allows the synthesis and study of this compound class.


Chemistry experiments with superheavy elements (those with atomic numbers of 104 and higher) are difficult to perform. First, scientists have to produce the element artificially in a particle accelerator, and the production rates are very low, usually no more than a few atoms per day. Furthermore, these atoms are very unstable and survive, at best, for less than 10 seconds. Nevertheless, scientists are keen to investigate the characteristics of superheavy elements, because they allow the influence of Einstein's theory of relativity on chemistry to be tested. The large number of positively charged protons in the nucleus of a superheavy element accelerates the electrons in its shells to extremely high velocities, close to 80% of the speed of light. At these speeds, relativistic effects make the electrons effectively much heavier than they are at rest, which in turn should influence the chemical properties of the superheavy atom. These effects can be probed by comparing superheavy elements with lighter elements that possess a similar atomic structure. Such studies are of great interest to chemists worldwide.
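A quick back-of-the-envelope check of the "80% of the speed of light" figure quoted above (my arithmetic, not the article's):

```latex
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
       = \frac{1}{\sqrt{1 - 0.8^{2}}}
       = \frac{1}{0.6} \approx 1.67
% so the effective (relativistic) electron mass is about 1.67 m_e, roughly
% two-thirds larger than the rest mass, which contracts the innermost
% orbitals and shifts the element's chemistry.
```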


'Quantum reporters' measure the magnetic resonance of a single proton


The positions of individual protons on a surface can be pinned down to within 0.1 nm, thanks to a new quantum technique based on nuclear magnetic resonance (NMR) developed by researchers in the US. The method, which works at room temperature, uses an effect that is usually considered a nuisance because it degrades the performance of diamond-based quantum bits (qubits). The researchers say that the technique could be used to study individual proteins or even spins in a superconductor.


At the heart of the new method are crystal defects that occur in diamond when two adjacent carbon atoms are replaced by a nitrogen atom and a vacant site. These "nitrogen-vacancy" (NV) centres have an electronic spin that is very well isolated from its surroundings, which means that they could play a key role in future quantum computers. And because an NV centre can emit just a single photon if excited by a laser, quantum information could be stored for long times in this kind of defect before being read out as a photon.


While NV centres that lie deep within a diamond are well-isolated, those within a few nanometres of the surface interact strongly with electron spins on the surface. Such centres would not, therefore, be used to make a quantum computer, but physicists have used them to study the properties of electrons on the surface of diamond. Two independent groups have also used NV centres to do NMR studies of molecules on the surface of diamond (see "Diamond downsizes classical MRI and NMR").


Now, Alex Sushkov and colleagues at Harvard University have developed a new NMR technique that uses the surface electrons as "quantum reporters" to measure the positions of individual protons on the surface of diamond. The first step involves mapping the locations of surface spins that are within a few nanometres of an NV centre, by applying a magnetic field to the diamond and then firing a sequence of radio-frequency (RF) pulses at the sample. Known as double electron–electron resonance (DEER), this is an established technique used to measure the distance between electrons in a molecule. Information is extracted from the system by measuring the final spin state of the NV centre by observing the fluorescent light it emits. By repeating the measurement with the magnetic field in different directions, the team can map the locations of the surface spins nearest to the NV centre.


In an experiment reported in Physical Review Letters, the team was able to locate four surface spins that were within several nanometres of an NV centre, which itself was about 3 nm below the diamond's surface. The team then focused its efforts on the spin nearest to the NV centre. Using an applied magnetic field and a different sequence of RF pulses, the researchers were able to make a "spin-echo" measurement of the magnetic field near that single "reporter" spin. This measurement is affected by the presence of the nuclear magnetic moments of nearby protons that happen to be stuck to the diamond surface. Two protons were seen to be near to the reporter spin and, by careful analysis of the spin-echo data, the team was able to determine the locations of the protons to within 0.1 nm. This distance is on a par with the spacing between atoms in molecules and solids.
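To see why protons within a nanometre or two are detectable at all, and why their positions can be pinned down so precisely, here is a rough order-of-magnitude sketch (my own estimate, not the authors' analysis) of the dipolar field a single proton produces at the reporter electron spin and the corresponding frequency shift. The steep 1/r³ dependence is what gives the method its spatial resolution.

```python
# Order-of-magnitude sketch (not the authors' analysis): dipolar magnetic
# field of a single proton at a nearby "reporter" electron spin, and the
# frequency shift a spin-echo sequence is sensitive to.  Angular factors of
# order one are ignored; only the 1/r^3 scaling matters here.
MU0_OVER_4PI = 1e-7          # T*m/A
MU_PROTON = 1.4106e-26       # proton magnetic moment, J/T
GAMMA_E_OVER_2PI = 28.0e9    # electron gyromagnetic ratio, Hz/T

def dipolar_shift_hz(r_nm: float) -> float:
    """Approximate shift (Hz) of an electron spin's precession frequency
    caused by a proton a distance r_nm away."""
    r = r_nm * 1e-9                                # metres
    b_dip = MU0_OVER_4PI * MU_PROTON / r**3        # tesla
    return GAMMA_E_OVER_2PI * b_dip                # hertz

for r_nm in (0.5, 1.0, 2.0):
    print(f"proton at {r_nm:.1f} nm -> shift of ~{dipolar_shift_hz(r_nm) / 1e3:.0f} kHz")
```

Shifts of tens to hundreds of kilohertz are well within reach of a spin-echo measurement, and a 0.1 nm change in distance changes the shift by tens of percent, which is the leverage behind the quoted localization.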


Disproving the Peres conjecture finally solves one of the most famous problems in quantum information physics


Since 1999, a conjecture by Asher Peres, one of the co-inventors of quantum teleportation, has piqued the interest of many scientists in the field. According to his hypothesis, the weakest form of quantum entanglement can never result in the strongest manifestation of the phenomenon. Scientists have now proven this conjecture to be false, thus solving one of the most famous problems in quantum information physics. The result was published in the journal Nature Communications.


The physicist Asher Peres was very interested in the phenomenon of quantum entanglement and its different manifestations. When two objects (take photons, for example) are entangled, they remain correlated regardless of the distance that separates them physically: whether they are separated by a millimeter or by several kilometers, any action done to one of them will immediately affect the other. To check whether a system is entangled, scientists test for Bell's inequality. If the experimental measurements violate Bell's inequality, this means that the two objects are entangled, and that they correspond to two manifestations, in different locations, of the same single object. This is called nonlocality.
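For reference, the most commonly used Bell test is the CHSH form of the inequality (standard background; the authors use a Bell inequality tailored to bound-entangled states). Two parties each choose between two measurement settings, and the measured correlations are combined into a single number:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
% Any local (classical) model obeys |S| <= 2, while quantum mechanics
% allows values up to 2\sqrt{2} \approx 2.83.
|S|_{\text{local}} \le 2, \qquad |S|_{\text{quantum}} \le 2\sqrt{2}
```

An experimental value of |S| greater than 2 is what the text means by "violating Bell's inequality", i.e., demonstrating nonlocality.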


In 1999, Asher Peres conjectured that the weakest form of an entanglement will never result in the strongest manifestation of the phenomenon. The violation of Bell's inequality represents the strongest form of entanglement. Two objects must indeed be strongly entangled in order for the system's experimental measurements to violate Bell's inequality. On the other hand, there also exist states with very weak entanglement. Asher Peres wondered if it would be possible to distil several weakly entangled states in order to make a strongly entangled one, as one would distil alcohol. The theory showed that this was possible, but not in every case. Certain states are in fact too weakly entangled to be distilled; this is the case of bound entanglement, which is considered the weakest form of the phenomenon. Peres therefore concluded that the weakest form of entanglement could never result in the strongest manifestation of the phenomenon, namely nonlocality.


Later, a number of scientists tried to prove his conjecture. Some succeeded in a few particular cases, but none were able to demonstrate the claim in general. Peres's conjecture was therefore considered to be one of the most famous unresolved problems in the field of quantum information physics... until now. In fact, Nicolas Brunner, a physics Professor at UNIGE's Faculty of science, and Tamas Vertesi, a researcher at the Hungarian Academy of Sciences, were able to disprove Peres's conjecture. "To do so, we just had to find a counter-example," explains Professor Brunner. "Using numerical algorithms, we showed that a bound entanglement can violate Bell's inequality, without needing to be distilled."


Reference:

  1. Tamás Vértesi, Nicolas Brunner. Disproving the Peres conjecture by showing Bell nonlocality from bound entanglement. Nature Communications, 2014; 5: 5297. DOI: 10.1038/ncomms6297

Method for symmetry-breaking in feedback-driven self-assembly of optical metamaterials


If you can uniformly break the symmetry of nanorod pairs in a colloidal solution, you're a step ahead of the game toward achieving new and exciting metamaterial properties. But traditional, thermodynamically driven colloidal assembly of these metamaterials, which are materials defined by their non-naturally-occurring properties, often results in structures with a high degree of symmetry in the bulk material. In this case, the energy requirements do not allow the structure to break its symmetry.


In a study led by Xiang Zhang, director of Berkeley Lab's Materials Sciences Division, his research group at the University of California (UC) Berkeley achieved symmetry-breaking in a bulk metamaterial solution for the first time. Using their new method, Zhang and his group demonstrated self-assembled optical metamaterials with tailored broken symmetries and hence unique electromagnetic responses. The results have been published in Nature Nanotechnology, in a paper titled "Feedback-driven self-assembly of symmetry-breaking optical metamaterials in solution."


"We developed an innovative self-assembly route which could surpass the conventional thermodynamic limit in chemical synthetic systems" explains Sui Yang, lead author of the Nature Nanotechnology paper and member of Zhang's research group. "Specifically, we use the material's own property as a self-correction feedback mechanism to self-determine the final structure."


This led the group to produce nanostructures that have historically been considered impossible to assemble. The widely used method of metamaterial synthesis is top-down fabrication, such as electron-beam or focused-ion-beam lithography, which often results in strongly anisotropic, small-scale metamaterials.


"People build metamaterials using top-down methods that include light exposure and electron beam exposure, which are inefficient and costly," says Xingjie Ni, another lead author on the paper. "If we want to use metamaterials, we need to develop a way to build them cheaply and efficiently."


Does the Arrow of Time Self-Emerge in a Gravitational System?

Study of masses interacting via gravity challenges the idea that special initial conditions are needed to give time a direction.


The fundamental laws of physics, we believe, do not depend on the direction of time. Why, then, is the future so different from the past? The origin of this “arrow of time” has puzzled physicists and philosophers for more than a century, and it remains one of the fundamental conceptual problems of modern physics [1]. Although a preferred direction of time can occur in models of physical systems, this typically happens only if one inserts very special initial conditions.


Julian Barbour at the University of Oxford and his colleagues [2] have now shown this tinkering isn’t necessary to produce an arrow of time in a system of masses interacting via Newtonian gravity. They demonstrate that the evolution of this surprisingly simple system almost always contains a unique moment of lowest “complexity,” a point they identify as a “past” from which two distinct (and more complex) “futures” emerge.


The work of Barbour and his colleagues is the latest in a long history of attempts to explain the arrow of time. One possibility, of course, is that we don’t know the right laws of physics—perhaps the correct fundamental laws do determine a preferred direction of time [3]. Alternatively, if the laws of nature do not pick out a preferred “future,” perhaps boundary conditions do. For example, most cosmological models assume, explicitly or implicitly, that the big bang was a moment of exceptionally low entropy.


Indeed, most physicists accept the view that the direction of time is the same as the direction of increasing entropy. But this is, at best, an incomplete picture, failing to explain why there should have been a rare condition of low entropy in the past. More than a century ago, Boltzmann suggested that our visible Universe might merely be a temporary, low-entropy statistical fluctuation, affecting a small portion of a much larger equilibrium system [4]. In that case, the direction of time would simply be the one that takes us back towards equilibrium. But most contemporary physicists find this explanation unsatisfying: a random fluctuation containing “us” would have been far more likely to produce a single galaxy, a planet, or just a “brain” rather than a whole universe [5, 6]. Moreover, according to the “Loschmidt irreversibility paradox,” if one posits such a moment of low entropy, entropy should increase both to the future and to the past, giving two separate arrows of time [7].


Electron wave function is split by tunnelling into different regions


New research by physicists from Brown University puts the profound strangeness of quantum mechanics in a nutshell—or, more accurately, in a helium bubble.

Experiments led by Humphrey Maris, professor of physics at Brown, suggest that the quantum state of an electron—the electron's wave function—can be shattered into pieces and those pieces can be trapped in tiny bubbles of liquid helium. To be clear, the researchers are not saying that the electron can be broken apart. Electrons are elementary particles, indivisible and unbreakable. But what the researchers are saying is in some ways more bizarre.


In quantum mechanics, particles do not have a distinct position in space. Instead, they exist as a wave function, a probability distribution that includes all the possible locations where a particle might be found. Maris and his colleagues are suggesting that parts of that distribution can be separated and cordoned off from each other.


"We are trapping the chance of finding the electron, not pieces of the electron," Maris said. "It's a little like a lottery. When lottery tickets are sold, everyone who buys a ticket gets a piece of paper. So all these people are holding a chance and you can consider that the chances are spread all over the place. But there is only one prize—one electron—and where that prize will go is determined later."


If Maris's interpretation of his experimental findings is correct, it raises profound questions about the measurement process in quantum mechanics. In the traditional formulation of quantum mechanics, when a particle is measured—meaning it is found to be in one particular location—the wave function is said to collapse.


"The experiments we have performed indicate that the mere interaction of an electron with some larger physical system, such as a bath of liquid helium, does not constitute a measurement," Maris said. "The question then is: What does?"


Turning loss into gain: Cutting power could dramatically boost laser output

Lasers – devices that deliver beams of highly organized light – are so deeply integrated into modern technology that their basic operations would seem well understood. CD players, medical diagnostics and military surveillance all depend on lasers.



Re-examining longstanding beliefs about the physics of these devices, Princeton engineers have now shown that carefully restricting the delivery of power to certain areas within a laser could boost its output by many orders of magnitude. The finding, published Oct. 26 in the journal Nature Photonics, could allow far more sensitive and energy-efficient lasers, as well as potentially more control over the frequencies and spatial pattern of light emission.


"It's as though you are using loss to your advantage," said graduate student Omer Malik, an author of the study along with Li Ge, now an assistant professor at the City University of New York, and Hakan Tureci, assistant professor of electrical engineering at Princeton. The researchers said that restricting the delivery of power causes much of the physical space within a laser to absorb rather than produce light. In exchange, however, the optimally efficient portion of the laser is freed from competition with less efficient portions and shines forth far more brightly than previous estimates had suggested.


The results, based on mathematical calculations and computer simulations, still need to be verified in experiments with actual lasers, but the researchers said it represents a new understanding of the fundamental processes that govern how lasers produce light.

"Distributing gain and loss within the material is a higher level of design – a new tool – that had not been used very systematically until now," Tureci said.


The heart of a laser is a material that emits light when energy is supplied to it. When a low level of energy is added, the light is "incoherent," essentially meaning that it contains a mix of wavelengths (or colors). As more energy is added, the material suddenly reaches a "lasing" threshold when it emits coherent light of a particular wavelength.


The entire surface of the material does not emit laser light; rather, if the material is arranged as a disc, for example, the light might come from a ring close to the edge. As even more energy is added, more patterns emerge – for example a ring closer to the center might reach the laser threshold. These patterns – called modes – begin to interact and sap energy from each other. Because of this competition, subsequent modes requiring higher energy may never reach their lasing thresholds. However, Tureci's research group found that some of these higher threshold modes were potentially far more efficient than the earlier ones if they could just be allowed to function without competition.


Faster switching helps ferroelectrics become viable replacement for transistors


Ferroelectric materials – commonly used in transit cards, gas grill igniters, video game memory and more – could become strong candidates for use in next-generation computers, thanks to new research led by scientists at the University of California, Berkeley, and the University of Pennsylvania.


The researchers found an easy way to improve the performance of ferroelectric materials in a way that makes them viable candidates for low-power computing and electronics. They described their work in a study published today (Sunday, Oct. 26) in the journal Nature Materials.


Ferroelectric materials have spontaneous polarization as a result of small shifts of negative and positive charges within the material. A key characteristic of these materials is that the polarization can be reversed in response to an electric field, enabling the creation of a “0” or “1” data bit for memory applications. Ferroelectrics can also produce an electric charge in response to physical force, such as being pressed, squeezed or stretched, which is why they are found in applications such as push-button igniters on portable gas grills.


“What we discovered was a fundamentally new and unexpected way for these ferroelectric materials to respond to applied electric fields,” said study principal investigator Lane Martin, UC Berkeley associate professor of materials science and engineering. “Our discovery opens up the possibility for faster switching and new control over novel, never-before-expected multi-state devices.”


Martin and other UC Berkeley researchers partnered with a team led by Andrew Rappe, University of Pennsylvania professor of chemistry and of materials science and engineering. UC Berkeley graduate student Ruijuan Xu led the study’s experimental design, and Penn graduate student Shi Liu led the study’s theoretical modeling.


Scientists have turned to ferroelectrics as an alternative form of data storage and memory because the material holds a number of advantages over conventional semiconductors. For example, anyone who has ever lost unsaved computer data after power is unexpectedly interrupted knows that today’s transistors need electricity to maintain their “on” or “off” state in an electronic circuit.


Because ferroelectrics are non-volatile, they can remain in one polarized state or another without power. This ability of ferroelectric materials to store memory without continuous power makes them useful for transit cards, such as the Clipper cards used to pay fare in the Bay Area, and in certain memory cards for consumer electronics. If used in next-generation computers, ferroelectrics would enable the retention of information so that data would be there if electricity goes out and then is restored.


“If we could integrate these materials into the next generation of computers, people wouldn’t lose their data if the power goes off,” said Martin, who is also a faculty scientist at the Lawrence Berkeley National Laboratory. “For an individual, losing unsaved work is an inconvenience, but for large companies like eBay, Google and Amazon, losing data is a significant loss of revenue.”


So what has held ferroelectrics back from wider use as on/off switches in integrated circuits? The answer is speed, according to the study authors.


6 mKelvin: The drive to create the coldest cubic meter in the universe


An international team of scientists recently set a world record by cooling a copper vessel with a volume of a cubic meter down to a temperature of 6 millikelvin, or -273.144 degrees Celsius. It was the first experiment to chill an object so large this close to absolute zero.
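The Celsius figure follows directly from the kelvin one; the conversion is shown here only to make the quoted value transparent:

```latex
T_{^{\circ}\mathrm{C}} = T_{\mathrm{K}} - 273.15
\quad\Rightarrow\quad
0.006\ \mathrm{K} - 273.15 = -273.144\ ^{\circ}\mathrm{C},
\qquad \text{i.e. } 6\ \mathrm{mK} \text{ above absolute zero}
```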


The collaboration, called CUORE (Cryogenic Underground Observatory for Rare Events), involves 130 scientists from the United States, Italy, China, Spain, France, and other countries. It is based at the underground Gran Sasso National Laboratory of the Istituto Nazionale di Fisica Nucleare, in Italy.


"This is a major technological achievement," said Karsten Heeger, a professor of physics at Yale and director of Yale's Arthur W. Wright Laboratory. CUORE is part of the new experimental program in neutrinos and dark matter pursued at the Wright Lab.


Yale physicists are building and testing instrumentation that will be used at temperatures of 10 mK in the experiment's cryostat, which is the chilled chamber. Reina Maruyama, an assistant professor of physics, is one of the original proponents of US involvement in CUORE and is a coordinator of its data analysis.


"In collaboration with the University of Wisconsin, we have developed a detector calibration system that will deploy radioactive sources into the coldest region of the cryostat and characterize our detectors," Heeger said.


Once the CUORE experiment is fully operational, it will study important properties of neutrinos, the fundamental, subatomic particles that are created by radioactive decay and do not carry an electrical charge.


Specifically, the experiment will look at a rare process called neutrinoless double-beta decay. The detection of this process would let researchers demonstrate, for the first time, that neutrinos and antineutrinos are the same—thereby offering a possible explanation for the abundance of matter, rather than anti-matter, in the universe.


The experiment uses heat-sensitive detectors that operate in extremely cold temperatures. "It poses a unique challenge," Heeger said. "We are trying to detect a minuscule amount of heat from nuclear decay, but need to know this very precisely. The detector calibration will tell us if we see the heat from double-beta decay or environmental backgrounds."


Trapped calcium ion qubits with an error rate of just 0.07% after writing and reading the bit 150,000 times

Qubits based on trapped ions can be prepared and manipulated with record-breaking accuracy, offering a promising scalable platform for quantum computing.


The realization, two decades ago, that quantum mechanics can be a powerful resource to speed up important computational tasks [1] led to intense research efforts to find adequate physical systems for quantum computation. One of the hurdles to a viable technology is the requirement to prepare, manipulate, and measure quantum bits (qubits) with near perfect accuracy: Imperfect control leads to errors that can accumulate over the computation process. Techniques like quantum error correction and fault-tolerant designs can, in principle, overcome these errors. But these strategies can be successful only if the error probabilities are lower than a threshold value. They also increase the complexity of the required quantum hardware, since they require additional qubits. Recent calculations [2] suggest that an error probability of less than 1% would enable fault-tolerant codes, and that lower error probabilities dramatically decrease the number of qubits required for such codes.


The quality of qubit manipulation in a number of physical systems has dramatically improved in the past few years [3, 4], raising hopes that a quantum computer, at a large enough scale to carry out meaningful computations, might be within reach. Now, Thomas Harty at the University of Oxford, UK, and colleagues [5] are reporting an important contribution to this goal with the demonstration that qubits consisting of trapped 43Ca+ ions can be manipulated with record-high fidelities (in quantum information theory, fidelity is a measure of the “closeness” of two quantum states). Their experiments suggest trapped-ion schemes could potentially provide the fundamental building blocks of a universal quantum computer.
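A toy calculation (mine, not from the papers cited) of why per-operation error rates near or below the roughly 1% threshold matter so much: if each elementary operation fails independently with probability p, the chance that an n-step computation runs error-free is (1 - p)ⁿ, which collapses quickly unless p is very small. Error correction removes that collapse, but at the cost of extra qubits, and the overhead shrinks as p drops further below threshold.

```python
# Toy model only: independent, uncorrected errors accumulating over a circuit.
# Real devices use error correction; this just illustrates why low raw error
# rates (like the ~0.07% quoted above) are so valuable.
def error_free_probability(p: float, n_ops: int) -> float:
    """Probability that n_ops operations all succeed when each fails
    independently with probability p."""
    return (1.0 - p) ** n_ops

for p in (1e-2, 1e-3, 7e-4):   # 1%, 0.1%, and roughly the 0.07% quoted above
    print(f"p = {p:.1e}: P(no error in 1000 ops) = {error_free_probability(p, 1000):.1%}")
```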


CERN makes its data from real collision events available to the public for the first time


Today CERN launched its Open Data Portal, which makes data from real collision events produced by LHC experiments available to the public for the first time. “Data from the LHC program are among the most precious assets of the LHC experiments, that today we start sharing openly with the world,” says CERN Director General Rolf Heuer. “We hope these open data will support and inspire the global research community, including students and citizen scientists.”


The LHC collaborations will continue to release collision data over the coming years. The first high-level and analyzable collision data openly released come from the CMS experiment and were originally collected in 2010 during the first LHC run. Open source software to read and analyze the data is also available, together with the corresponding documentation. The CMS collaboration is committed to releasing its data three years after collection, after they have been thoroughly studied by the collaboration.


“This is all new and we are curious to see how the data will be re-used,” says CMS data preservation coordinator Kati Lassila-Perini. “We’ve prepared tools and examples of different levels of complexity from simplified analysis to ready-to-use online applications. We hope these examples will stimulate the creativity of external users.”

In parallel, the CERN Open Data Portal gives access to additional event data sets from the ALICE, ATLAS, CMS and LHCb collaborations that have been prepared for educational purposes. These resources are accompanied by visualization tools.


All data on OpenData.cern.ch are shared under a Creative Commons CC0 public domain dedication. Data and software are assigned unique DOI identifiers to make them citable in scientific articles. And software is released under open source licenses. The CERN Open Data Portal is built on the open-source Invenio Digital Library software, which powers other CERN Open Science tools and initiatives.


Researchers create 3-D stereoscopic plasmonic color prints with nanopixels


By designing nanopixels that encode two sets of information—or colors of light—within the same pixel, researchers have developed a new method for making 3D color prints. Each pixel can exhibit one of two colors depending on the polarization of the light used to illuminate it. So by viewing the pixels under light of both polarizations, two separate images can be seen. If the two images are chosen to be slightly displaced views of the same scene, viewing both simultaneously results in depth perception and the impression of a 3D stereoscopic image.


The researchers, led by Professor Joel K.W. Yang, at A*STAR (the Agency for Science, Technology and Research) in Singapore, the National University of Singapore, and the Singapore University of Technology and Design, have published a paper on the new technique for realizing 3D full-color stereoscopic prints in a recent issue of Nature Communications.


"We have created possibly the smallest-ever stereoscopic images using pixels formed from plasmonic nanostructures," Yang told Phys.org. "Such stereoscopic images do not require the viewer to don special glasses, but instead, the depth perception and 3D effect is created simply by viewing the print through an optical microscope coupled with polarizers."


The work is based on the concept of surface plasmon resonance: metal nanostructures can scatter different wavelengths (colors) of light due to the fact that the tiny nanostructures themselves resonate at different wavelengths. If a nanostructure is circular, its resonance is polarization-independent because the diameter of the circle is the same from all directions. However, if a nanostructure is biaxial (such as an ellipse or rectangle), its resonance will depend on the polarization of the incident light. By tailoring the exact dimensions of the biaxial nanopixels, researchers can generate different colors under different polarizations.


Building on these ideas, the researchers in the current study have demonstrated that polarization-sensitive nanopixels that encode two sets of information can be used to produce 3D stereoscopic microprints. To do this, the researchers created nanopixels out of tiny pieces of aluminum a hundred or so nanometers across. Because these shapes are biaxial, they exhibit plasmonic resonances at different wavelengths for each axis, with the colors determined almost entirely by the dimension of the axis parallel to the polarization direction. For example, a 130-nm x 190-nm elliptical pixel appears green under y-polarized light and purple under x-polarized light. Comparing the two pixel shapes, the researchers found that the elliptical pixels have a broader range of polarization-dependent colors, while the nanosquare dimer pixels have lower levels of cross-talk, minimizing unwanted mixing of colors.
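The encoding idea can be sketched in a few lines of code. This is a conceptual sketch only: the size-to-colour lookup is hypothetical apart from the 130 nm / 190 nm entry quoted above, and the real pixels are designed by full electromagnetic simulation rather than a lookup table.

```python
# Conceptual sketch of dual-image encoding in biaxial plasmonic nanopixels.
# Each pixel shows one colour under x-polarized light and another under
# y-polarized light, because each colour is set by the axis length parallel
# to the polarization.  The lookup below is hypothetical except for the
# 130 nm -> purple and 190 nm -> green pair mentioned in the text.
AXIS_NM_TO_COLOUR = {130: "purple", 160: "blue", 190: "green", 220: "orange"}
COLOUR_TO_AXIS_NM = {c: nm for nm, c in AXIS_NM_TO_COLOUR.items()}

def encode_pixel(colour_x: str, colour_y: str) -> tuple[int, int]:
    """Choose ellipse axis lengths (nm) so the pixel shows colour_x under
    x-polarized light and colour_y under y-polarized light."""
    return COLOUR_TO_AXIS_NM[colour_x], COLOUR_TO_AXIS_NM[colour_y]

def decode_pixel(axes: tuple[int, int], polarization: str) -> str:
    """Colour seen under 'x' or 'y' polarized illumination."""
    axis_x, axis_y = axes
    return AXIS_NM_TO_COLOUR[axis_x if polarization == "x" else axis_y]

# One pixel of a stereo pair: purple in the x-polarized image and green in
# the y-polarized image, matching the 130 nm x 190 nm example in the text.
pixel = encode_pixel(colour_x="purple", colour_y="green")
print(pixel, decode_pixel(pixel, "x"), decode_pixel(pixel, "y"))
```

Encoding a full stereo pair then amounts to running encode_pixel over every pixel of the left-eye and right-eye images and fabricating an ellipse with the resulting axis lengths at each site.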


Spiral laser beam used to create a whirlpool of hybrid light-matter particles called polaritons


Physicists at Australian National University have engineered a spiral laser beam and used it to create a whirlpool of hybrid light-matter particles called polaritons.  "Creating circulating currents of polaritons – vortices – and controlling them has been a long-standing challenge," said leader of the team, theoretician Dr Elena Ostrovskaya, from the Research School of Physics and Engineering. "We can now create a circulating flow of these hybrid particles and sustain it for hours."


Polaritons are hybrid particles that have properties of both matter and light. The ability to control polariton flows in this way could aid the development of completely novel technology to link conventional electronics with new laser and fibre-based technologies.


Polaritons form in semiconductors when laser light interacts with electrons and holes (positively charged vacancies) so strongly that it is no longer possible to distinguish light from matter.


The team created the spiral beam by putting their laser through a piece of brass with a spiral pattern of holes in it. This was directed into a semiconductor microcavity, a tiny wafer of aluminium gallium arsenide, a material used in LEDs, sandwiched between two reflectors. "The vortices have previously only appeared randomly, and always in pairs that swirl in opposite directions," said Dr Robert Dall, who led the experimental part of the project. "However, by using a spiral mask to structure our laser, we create a chiral system that prefers one flow direction. Therefore we can create a single, stable vortex at will."


Physicists sent a beam of twisted light 3 km through the air above Vienna


It is the first time that information has been transmitted outdoors using the "twist" of a visible light beam. This twisting property could allow very fast communication because light with different amounts of twist, encoding separate channels of information, could be sent simultaneously.


Reported in the New Journal of Physics, the technique was tested by sending three images of famous Austrians. The images were black-and-white portraits of the physicists Ludwig Boltzmann and Erwin Schroedinger, and composer Wolfgang Amadeus Mozart, which were transmitted with an error rate of only 1.7%.


Above the rooftops of Mozart's own city, where the only long-range signal known to the famous composer would have been church bells, his portrait was broken down into pixels and travelled through the night inside a green laser beam. The twisting of light, technically described as its "orbital angular momentum" (OAM), was first demonstrated in the 1990s and so would probably have surprised the two famous physicists as well.


Rather than polarised light waves, which are restricted in the directions that they can "wiggle", light with this type of momentum twists through space like a corkscrew. In terms of individual photons of light, it means that instead of spinning like the Earth around its own axis, their energy traces out a spiral. It is the same sort of momentum that sees the Earth orbit the sun, but the photons are also moving forward at the speed of light. That corkscrew-like motion is useful because instead of just having two possible directions like polarization (clockwise or anti-clockwise), it can turn in either direction with a potentially infinite number of twists - much like a screw with multiple threads. This is why physicists have been investigating whether twisted light could help transmit information very quickly: each twist configuration could be its own channel, just like different colors of light inside an optical fiber.


In the new study, however, there were no cables. Researchers from the University of Vienna set up a green laser at the window of a tower, and shone it onto a spatial light modulator. This gadget, which consists of a specially controlled liquid crystal display (LCD), put two different twists on the light that it reflected and sent across the city. "We didn't directly use the OAM itself, but a superposition of two angular momentums, which go in opposite directions," said lead author Mario Krenn, a PhD student at the university's Institute for Quantum Optics and Quantum Information.
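Some standard background on what "twist" means mathematically (textbook notation defined here, not taken from the paper): a beam carrying orbital angular momentum has an azimuthal phase winding, and superposing opposite windings, as Krenn describes, produces a petal-like intensity pattern whose lobes reveal which OAM channel was sent.

```latex
% A beam with l twists carries l\hbar of orbital angular momentum per photon:
E_{l}(r,\phi,z) \;\propto\; A(r,z)\, e^{\,i l \phi}, \qquad l \in \mathbb{Z}

% Superposing opposite twists +l and -l gives a standing azimuthal pattern
% with 2|l| intensity lobes, which a camera can distinguish:
e^{\,i l \phi} + e^{-\,i l \phi} = 2\cos(l\phi)
```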



Cheaper, more compact particle accelerators are a step closer


Scientists working on an experiment at the SLAC National Accelerator Laboratory in the US have taken a step forward in developing a technology which could significantly reduce the size of particle accelerators. The technology is able to accelerate particles more rapidly than conventional accelerators at a much smaller size.


One of the most impressive aspects of particle accelerators used for research, such as the Large Hadron Collider (LHC) at CERN, is their physical size. Yet, even with a circumference of 27 km, the LHC would be smaller than most of the next generation of proposed colliders. For example, the International Linear Collider (ILC), a possible future collider of electrons and positrons (anti-electrons), could be 31 km long, and there is even a proposal for a circular accelerator with an 80 km circumference that could be built at CERN as part of the Future Circular Colliders (FCC) project.


The size of all of these machines is determined by our ability to build structures that can transfer energy to particles allowing us to accelerate them to greater speeds. The higher the speed, the greater the energy when these particle beams collide, giving scientists a better chance of answering fundamental questions about the universe. This is because higher energy collisions can create conditions that are similar to those existing when the universe was born.


Most current accelerators use a structure called an “rf cavity”, a carefully designed “box” through which the particle beam passes. The cavity transfers electromagnetic energy into the kinetic energy of particles, accelerating them. However, there is a limit to the amount of energy that an rf cavity can transfer to particles. This is because, despite operating in a vacuum, there is a risk that increasing electromagnetic fields can lead to lightning-like discharges of energy.

However, even routine experiments in machines like the LHC require more energy than a single rf cavity can provide. That is why the current solution is to use many cavities arranged in a straight line in a linear machine, such as SLAC's, or to send the beam through the same cavities many times in a circular machine, such as the LHC.


Either solution presents challenges and requires a large machine to fit in the many parts needed. This raises the costs. Any technology which can increase the acceleration with smaller parts and without the need for more machinery will make future accelerators more compact.


This matters because particle accelerators are not just for particle physicists. They are increasingly used in medicine, industry and security. For example, accelerators provide X-rays and particle beams for cancer therapy, for the fabrication of minuscule devices and for scanning the contents of everything from suitcases to freight containers.


The new technology that could enable more compact particle accelerators has just been described in a study in Nature. The study suggests that, if bunches of electrons are passed through a short column of lithium-vapour plasma in rapid succession, the electric field of the plasma can transfer enough energy to accelerate particles hundreds of times faster than the LHC does, in a stage only 30 cm in length.
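To put the "hundreds of times" claim in perspective, here is an order-of-magnitude comparison using illustrative numbers (mine, not figures from the Nature paper): conventional rf cavities sustain accelerating gradients of roughly a few tens of MV/m, whereas plasma wakefields can reach several GV/m.

```latex
% Energy gained by an electron over a length L in an accelerating field E_z:
\Delta E = e\,E_z\,L
% Illustrative values (assumed, order-of-magnitude only):
\Delta E_{\text{rf}}     \sim 25\ \mathrm{MV/m} \times 0.3\ \mathrm{m} \approx 7.5\ \mathrm{MeV}
\qquad
\Delta E_{\text{plasma}} \sim 5\ \mathrm{GV/m} \times 0.3\ \mathrm{m} \approx 1.5\ \mathrm{GeV}
% i.e. a factor of a few hundred over the same 30 cm.
```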


String field theory could be the foundation of quantum mechanics

Two USC researchers have proposed a link between string field theory and quantum mechanics that could open the door to using string field theory—or a broader version of it, called M-theory—as the basis of all physics.


"This could solve the mystery of where quantum mechanics comes from," said Itzhak Bars, USC Dornsife College of Letters, Arts and Sciences professor and lead author of the paper. Bars collaborated with Dmitry Rychkov, his Ph.D. student at USC. The paper was published online on Oct. 27 by the journal Physics Letters.


Rather than use quantum mechanics to validate string field theory, the researchers worked backwards and used string field theory to try to validate quantum mechanics.


In their paper, which reformulated string field theory in a clearer language, Bars and Rychkov showed that a set of fundamental quantum mechanical principles known as "commutation rules" may be derived from the geometry of strings joining and splitting.


"Our argument can be presented in bare bones in a hugely simplified mathematical structure," Bars said. "The essential ingredient is the assumption that all matter is made up of strings and that the only possible interaction is joining/splitting as specified in their version of string field theory."


Physicists have long sought to unite quantum mechanics and general relativity, and to explain why both work in their respective domains. First proposed in the 1970s, string theory resolved inconsistencies of quantum gravity and suggested that the fundamental unit of matter was a tiny string, not a point, and that the only possible interactions of matter are strings either joining or splitting. Four decades later, physicists are still trying to hash out the rules of string theory, which seem to demand some interesting starting conditions to work, like extra dimensions, which may explain why quarks and leptons have electric charge, color and "flavor" that distinguish them from one another.


At present, no single set of rules can be used to explain all of the physical interactions that occur in the observable universe.

Read more at: http://phys.org/news/2014-11-field-theory-foundation-quantum-mechanics.html#jCp


Step towards photon-based scalable quantum logics: Entanglement of initially uncorrelated incident photons


Realizing a strong interaction between individual photons is an important objective of research in quantum science and technology. It requires an optical medium in which light experiences a phase shift that depends nonlinearly on the photon number. Once the additional two-photon phase shift reaches π, such an ultra-strong nonlinearity could enable the implementation of high-fidelity quantum logic operations. However, the nonlinear response of standard optical media is orders of magnitude too weak. A team of scientists has now demonstrated a fiber-based nonlinearity that realizes an additional two-photon phase shift close to the ideal value of π. They employed a whispering-gallery-mode resonator, interfaced by an optical nanofiber, in which the presence of a single rubidium atom in the resonator mode results in a strongly nonlinear response. They were able to show that this results in entanglement of initially uncorrelated incident photons. This demonstration of a fiber-integrated, ultra-strong nonlinearity is a decisive step towards photon-based scalable quantum logics.
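Why a two-photon phase shift of π is the benchmark (schematic reasoning in notation defined here, not taken from the paper): if a medium imprints a phase φ₁ on each single photon plus an extra nonlinear phase φ₂ only when two photons are present, then φ₂ = π flips the sign of the two-photon amplitude relative to two independent photons, which is exactly the entangling resource behind a controlled-phase gate.

```latex
% Photon-number states, single-photon phase \varphi_1, additional
% two-photon (nonlinear) phase \varphi_2:
|0\rangle \to |0\rangle, \qquad
|1\rangle \to e^{i\varphi_1}|1\rangle, \qquad
|2\rangle \to e^{i(2\varphi_1 + \varphi_2)}|2\rangle
% For \varphi_2 = \pi the two-photon component acquires a relative minus
% sign, the nonlinear element needed for high-fidelity photonic logic.
```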

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

See me here, see me there: A quantum world arising from many ordinary ones

See me here, see me there: A quantum world arising from many ordinary ones | Amazing Science | Scoop.it

The bizarre behavior of the quantum world — with objects existing in two places simultaneously and light behaving as either waves or particles — could result from interactions between many 'parallel' everyday worlds, a new theory suggests.


“It is a fundamental shift from previous quantum interpretations,” says Howard Wiseman, a theoretical quantum physicist at Griffith University in Brisbane, Australia, who together with his colleagues describes the idea in Physical Review X.


Theorists have tried to explain quantum behavior through various mathematical frameworks. One of the older interpretations envisages the classical world as stemming from the existence of many simultaneous quantum ones. But that ‘many worlds’ approach, pioneered by the US theorist Hugh Everett III in the 1950s, relies on the worlds branching out independently from one another, and not interacting at all.


By contrast, Wiseman’s team envisages many worlds bumping into one another, calling it the 'many interacting worlds' approach. On its own, each world is ruled by classical Newtonian physics. But together, the interacting motion of these worlds gives rise to phenomena that physicists typically ascribe to the quantum world.


The authors work through the mathematics of how that interaction could produce quantum phenomena. For instance, one well-known example of quantum behavior is that particles are able to tunnel through an energy barrier that in a classical world they would not be able to overcome on their own. Wiseman says that, in his scenario, as two classical worlds approach an energy barrier from either side, one of them will increase in speed while the other will bounce back. The leading world will thus pop through the seemingly insurmountable barrier, just as particles do in quantum tunneling.
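
The mechanism can be caricatured with two Newtonian particles and an invented repulsion between them. The short Python sketch below is not the authors' actual model: every parameter and the form of the inter-world force are made up for illustration, and both "worlds" start on the same side of the barrier, a simpler variant of the picture described above.

import numpy as np

# Toy caricature of "many interacting worlds" tunnelling (NOT the model in
# the paper): two classical particles ("worlds") move in 1D toward a
# Gaussian energy barrier. Each has kinetic energy below the barrier
# height, but an invented inter-world repulsion can transfer energy to
# the leading particle and push it over the top.

V0 = 1.0      # barrier height (arbitrary units)
W = 0.5       # barrier width
K = 0.1       # strength of the invented inter-world repulsion

def barrier_force(x):
    """F = -dV/dx for the Gaussian barrier V(x) = V0 * exp(-x^2 / W^2)."""
    return 2.0 * V0 * x / W**2 * np.exp(-(x / W)**2)

def interworld_force(xa, xb):
    """Invented repulsion on the particle at xa from the one at xb."""
    d = xa - xb
    return K * d / (abs(d)**3 + 1e-9)

def simulate(x0, v0, dt=1e-3, steps=30000):
    x = np.array(x0, dtype=float)
    v = np.array(v0, dtype=float)
    for _ in range(steps):
        f = np.array([
            barrier_force(x[0]) + interworld_force(x[0], x[1]),
            barrier_force(x[1]) + interworld_force(x[1], x[0]),
        ])
        v += f * dt   # unit masses; simple Euler integration is enough for a toy
        x += v * dt
    return x, v

# Both worlds start left of the barrier (peak at x = 0), moving right.
# Kinetic energies 0.72 and 0.845 are each below the barrier height V0 = 1.
x_final, v_final = simulate(x0=[-3.0, -4.5], v0=[1.2, 1.3])
print("final positions :", x_final)
print("final velocities:", v_final)

Running it and inspecting the final positions shows whether, for the chosen numbers, the leading world ends up past the barrier (x > 0) while the trailing one is reflected; tuning the invented repulsion strength K changes the outcome, which is the qualitative point of the analogy.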


But much work remains. “By no means have we answered all the questions that such a shift entails,” says Wiseman. Among other things, he and his collaborators have yet to overcome challenges such as showing how the many-interacting-worlds approach accounts for quantum entanglement, a phenomenon in which particles separated by a distance remain linked in their properties.

more...
Carlos Garcia Pando's comment, October 31, 2014 5:25 AM
I think entanglement is a consequence of two simple universes perfectly matching in one particle. What we see is not two entangled particles but one particle that belongs to two very close universes. Close in a different sense, not spatial proximity as we know it, but close enough to share at least one particle in all its observable attributes but space position.
Vloasis's curator insight, October 31, 2014 2:56 PM

Much to ponder.

Scooped by Dr. Stefan Gruenwald
Scoop.it!

Isotope effect produces new type of chemical bond - the vibrational muonium bond

Isotope effect produces new type of chemical bond - the vibrational muonium bond | Amazing Science | Scoop.it

Researchers believe they have confirmed the existence of a new type of chemical bond, first proposed some 30 years ago but never convincingly demonstrated because of the lack of experimental evidence and the relatively poor accuracy of the quantum chemistry methods that prevailed at the time. The new work also shows how substituting isotopes can result in fundamental changes in the nature of chemical bonding.


In the early 1980s it was proposed that in certain transition states consisting of a very light atom sandwiched between two heavy ones, the system would be stabilised not by conventional van der Waals forces but by vibrational bonding, with the light atom shuttling between its two neighbours. However, despite several groups searching for such a system, none was demonstrated and the hunt fizzled out.

Now, Jörn Manz, of the Free University of Berlin and Shanxi University in China, and colleagues believe they have the theoretical and experimental evidence to demonstrate a stable vibrational bond.


The researchers carried out a series of theoretical experiments looking at the reaction of HBr with Br to create the radical BrHBr, but using different isotopes of hydrogen. By using muons – elementary particles that are similar to the electron but roughly 200 times heavier – the team could substitute a range of hydrogen isotopes into BrHBr, from the relatively hefty muonic helium, 4.1H (a helium-4 atom in which one electron is replaced by a negative muon, so that it behaves chemically like a heavy hydrogen isotope of mass about 4.1), to the extremely light muonium, Mu (a positive muon bound to an electron), whose mass is nearly 40 times smaller than that of 4.1H.


The team mapped two key quantities: the potential energy surface of the system – the three-dimensional potential energy ‘landscape’, with its hills and valleys, that relates the system's energy to its geometry – and a quantum mechanical quantity, the vibrational zero-point energy, or ZPE.


Classically, a bond will form if there is a net reduction in the potential energy of the system. However, in certain circumstances, if there is a sufficiently large decrease in the vibrational ZPE, this can overcome the need for a decrease in potential energy and the system can be stabilised by a vibrational bond.
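
Put schematically (standard vibrational-structure notation, not the authors' actual numbers), the relevant energy of each configuration is roughly E ≈ V_min + ZPE, with

\mathrm{ZPE} = \sum_i \tfrac{1}{2}\hbar\omega_i , \qquad \omega_i \propto \sqrt{k_i/\mu}

so the zero-point energy is largest for the lightest isotope. Writing \Delta E = \Delta V + \Delta\mathrm{ZPE} for the change on forming BrMuBr from Br + MuBr, a vibrational bond requires \Delta E < 0 even though \Delta V > 0; that is, the drop in zero-point energy must outweigh the unfavourable change in potential energy, which is why the ultra-light muonium is the natural candidate for the effect.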

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

WIRED: A Brief History of Mind-Bending Ideas About Black Holes

WIRED: A Brief History of Mind-Bending Ideas About Black Holes | Amazing Science | Scoop.it

The story starts in 1784, when a geologist named John Michell was thinking deeply about Isaac Newton’s theory of gravity. In Newtonian physics, a cannonball can be shot into orbit around the Earth if it surpasses a particular speed, known as the planet’s escape velocity.


This speed depends on the mass and radius of the object you are trying to escape from. Michell’s insight was to imagine a body whose escape velocity was so great that it exceeded the speed of light – 300,000 kilometers per second – first measured in 1676 by the Danish astronomer Ole Romer.


Michell presented his results to other scientists, who speculated that massive “dark stars” might exist in abundance in the sky but be invisible because light can’t escape their surfaces. The French mathematician Pierre-Simon Laplace later made an independent discovery of these “dark stars”, and both luminaries correctly calculated the very small radius – roughly 3 kilometers – that such an object would have if it were as massive as our sun.
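
The arithmetic behind that figure is a one-line Newtonian exercise (an illustration, not a quotation from Michell or Laplace): the escape velocity from a mass M at radius r is

v_{esc} = \sqrt{2GM/r}

and setting v_{esc} = c gives r = 2GM/c^{2}, which for one solar mass (M ≈ 2 × 10^30 kg) works out to roughly 3 kilometers. By a coincidence of the factors of two, this Newtonian "dark star" radius is numerically identical to the Schwarzschild radius that general relativity later assigned to a black hole of the same mass.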


After the revolutions of 20th century physics, black holes got much weirder. In 1916, shortly after Einstein published the complex equations underpinning General Relativity (which Einstein himself couldn’t entirely solve), a German astronomer named Karl Schwarzschild showed that a massive object squeezed to a single point would warp space around it so much that even light couldn’t escape. Though the cartoon version of black holes has them sucking everything up like a vacuum cleaner, light is only unable to escape Schwarzschild’s object from inside a particular radius, called the Schwarzschild radius. Outside this “event horizon,” you could still safely leave the black hole's vicinity.


Neither Schwarzschild nor Einstein believed this object was anything other than a mathematical curiosity. It took a much better understanding of the lives of stars before black holes were taken seriously. You see, a star only works because it preserves a delicate balance between gravity, which is constantly trying to pull its mass inward, and the nuclear furnace in its belly, which exerts pressure outward. At some point a star runs out of fuel and the fusion at its core turns off. Gravity is given the upper hand, causing the star to collapse. For stars like our sun, this collapse is halted when the electrons in the star’s atoms get so close that they generate a quantum mechanical force called electron degeneracy pressure. An object held up by this pressure is called a white dwarf.


In 1930, the Indian physicist Subrahmanyan Chandrasekhar showed that, given enough mass, a star’s gravity could overcome this electron degeneracy pressure, squeezing all its protons and electrons into neutrons. Though a neutron degeneracy pressure could then hold the weight up, forming a neutron star, the physicist Robert Oppenheimer found that an even more massive object could overcome this final outward pressure, allowing gravity to win and crushing everything down to a single point. Scientists slowly accepted that these things were real objects, not just weird mathematical solutions to the equations of General Relativity. In 1967, physicist John Wheeler used the term “black hole” to describe them in a public lecture, a name that has stuck ever since.



more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

New evidence for an exotic, predicted superconducting state found

New evidence for an exotic, predicted superconducting state found | Amazing Science | Scoop.it
A research team led by a Brown University physicist has produced new evidence for an exotic superconducting state, first predicted a half-century ago, that can arise when a superconductor is exposed to a strong magnetic field.


"It took 50 years to show that this phenomenon indeed happens," said Vesna Mitrovic, associate professor of physics at Brown University, who led the work. "We have identified the microscopic nature of this exotic quantum state of matter."


The research is published in Nature Physics.


Superconductivity—the ability to conduct electric current without resistance—depends on the formation of electron twosomes known as Cooper pairs (named for Leon Cooper, a Brown University physicist who shared the Nobel Prize for identifying the phenomenon). In a normal conductor, electrons rattle around in the structure of the material, which creates resistance. But Cooper pairs move in concert in a way that keeps them from rattling around, enabling them to travel without resistance.


Magnetic fields are the enemy of Cooper pairs. In order to form a pair, electrons must be opposites in a property that physicists refer to as spin. Normally, a superconducting material has a roughly equal number of electrons with each spin, so nearly all electrons have a dance partner. But strong magnetic fields can flip "spin-down" electrons to "spin-up", making the spin population in the material unequal.
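
A standard back-of-the-envelope way to quantify "strong" here (a textbook estimate, not taken from the paper) is to compare the Zeeman energy gained by aligning spins with the field against the energy gained by pairing. Conventional pairing becomes unfavourable once roughly

\mu_B B \gtrsim \Delta/\sqrt{2}

where \Delta is the superconducting gap and \mu_B is the Bohr magneton. This is the Pauli (Clogston-Chandrasekhar) limit; the FFLO state discussed below is one way for superconductivity to survive near and above it.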


"The question is what happens when we have more electrons with one spin than the other," Mitrovic said. "What happens with the ones that don't have pairs? Can we actually form superconducting states that way, and what would that state look like?"


In 1964, physicists predicted that superconductivity could indeed persist in certain kinds of materials amid a magnetic field. The prediction was that the unpaired electrons would gather together in discrete bands or stripes across the superconducting material. Those bands would conduct normally, while the rest of the material would remain superconducting. This modulated superconducting state came to be known as the FFLO phase, named for theorists Peter Fulde, Richard Ferrell, Anatoly Larkin, and Yuri Ovchinnikov, who predicted its existence.


To investigate the phenomenon, Mitrovic and her team used an organic superconductor with the catchy name κ-(BEDT-TTF)2Cu(NCS)2. The material consists of ultra-thin sheets stacked on top of each other and is exactly the kind of material predicted to exhibit the FFLO state.
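
In the standard theoretical shorthand (generic FFLO notation, not specific to this material), the hallmark of the phase is a pairing amplitude that oscillates in space instead of being uniform:

\Delta(\mathbf{r}) = \Delta_0\, e^{i\mathbf{q}\cdot\mathbf{r}} \ \ \text{(Fulde-Ferrell)} \qquad \text{or} \qquad \Delta(\mathbf{r}) = \Delta_0 \cos(\mathbf{q}\cdot\mathbf{r}) \ \ \text{(Larkin-Ovchinnikov)}

The cosine form vanishes on periodic nodal planes, and it is in these normal-conducting regions that the excess spin-up electrons described in the next paragraph can accumulate.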


After applying an intense magnetic field to the material, Mitrovic and her collaborators from the French National High Magnetic Field Laboratory in Grenoble probed its properties using nuclear magnetic resonance (NMR). What they found were regions across the material where unpaired, spin-up electrons had congregated. These "polarized" electrons behave "like little particles constrained in a box," Mitrovic said, and they form what are known as Andreev bound states.

more...
No comment yet.