Amazing Science
Tag: cosmology
Scooped by Dr. Stefan Gruenwald

High-energy gamma-ray bursts have 100 times the energy output of a supernova


In the 1960s a series of satellites was built as part of Project Vela, which was intended to detect violations of the 1963 ban on above-ground testing of nuclear weapons. The Vela satellites were designed to detect bursts of gamma rays, which are high-energy electromagnetic waves produced by radioactive decay. If a nuclear weapon were detonated in space, the resulting radioactive decay would release a large amount of gamma rays, which the Vela satellites would detect.

In 1967, two of the Vela probes detected a large spike of gamma rays, but the signature of this spike was very different from that of a nuclear explosion. Soon more gamma-ray spikes were detected, and these likewise differed from the expected signature of a nuclear test. Since the bursts were observed by multiple satellites, the Vela team was able to compare their arrival times at the different satellites, and it soon became clear that the bursts had an extraterrestrial source. The Vela project was classified, however, so it wasn't until 1973 that the results were declassified and published in the Astrophysical Journal. Only then were astronomers made aware of these gamma-ray bursts (GRBs).


We now know that GRBs are very common. On average, about one gamma-ray burst occurs every day. They appear randomly in all directions of the sky, which means they aren't produced in our galaxy. If they were, GRBs would mostly be found along the plane of the Milky Way.


Some gamma-ray bursts (known as long bursts) can last more than two seconds. These bursts have an afterglow, caused by gamma rays colliding with interstellar material near the event and producing light at other wavelengths. This afterglow allows us to measure the redshift of these events, and what we find is that they are quite distant. The closest observed gamma-ray burst occurred at a distance of 100 million light years, and many occurred billions of light years away.


We aren't entirely sure what causes a gamma-ray burst. Given their distance and apparent brightness, they must be extraordinarily energetic, with about 100 times more energy than a supernova. They may be caused by huge supernova explosions known as hypernovae, or by supernova explosions that occur with a rotational axis pointing in our direction, producing a jet-like burst of energy. Short GRBs, lasting less than two seconds, may be due to collisions between neutron stars.
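
For a rough sense of scale, that energy can be turned into an equivalent rest mass via E = mc². The canonical supernova energy used below (~10^44 joules) is an assumed reference value, not a figure from the article:

```python
# Back-of-envelope: how much rest mass corresponds to "100 times a supernova"?
# Assumption (not from the article): a canonical supernova releases ~1e44 J of energy.
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

E_supernova = 1e44             # J, assumed canonical value
E_grb = 100 * E_supernova      # "about 100 times more energy than a supernova"

m_equiv = E_grb / C**2         # E = m c^2
print(f"GRB energy ~ {E_grb:.1e} J")
print(f"Rest mass equivalent ~ {m_equiv:.2e} kg ~ {m_equiv / M_SUN:.3f} solar masses")
```

On these assumed numbers, a single burst corresponds to converting several percent of a solar mass entirely into energy.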


Given the huge energy of GRBs, one might wonder whether one could occur in our galaxy. Based on the average rate of GRBs and the huge distances at which they typically occur, one probably happens in our galaxy about once every 5 million years.

Scooped by Dr. Stefan Gruenwald

Do we live in a 2D hologram? New Fermilab experiment will test the nature of the universe

A unique experiment at the U.S. Department of Energy's Fermi National Accelerator Laboratory called the Holometer has started collecting data that will answer some mind-bending questions about our universe – including whether we live in a hologram.


Much like characters on a television show would not know that their seemingly 3D world exists only on a 2D screen, we could be clueless that our 3D space is just an illusion. The information about everything in our universe could actually be encoded in tiny packets in two dimensions. Get close enough to your TV screen and you'll see pixels, small points of data that make a seamless image if you stand back. Scientists think that the universe's information may be contained in the same way, and that the natural "pixel size" of space is roughly 10 trillion trillion times smaller than an atom, a distance that physicists refer to as the Planck scale.
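
The "10 trillion trillion times smaller than an atom" comparison can be checked against the textbook definition of the Planck length, l_P = sqrt(ħG/c³); the ~10⁻¹⁰ m atomic size used below is an assumed round number rather than a figure from the article:

```python
import math

# Standard physical constants (SI units)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8         # speed of light, m/s

planck_length = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
atom_size = 1e-10                            # typical atomic diameter in m (assumed)

print(f"Planck length: {planck_length:.2e} m")
print(f"Atom / Planck length: {atom_size / planck_length:.1e}")  # ~6e24, of order 10 trillion trillion
```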


"We want to find out whether space-time is a quantum system just like matter is," said Craig Hogan, director of Fermilab's Center for Particle Astrophysics and the developer of the holographic noise theory. "If we see something, it will completely change ideas about space we've used for thousands of years."


Quantum theory suggests that it is impossible to know both the exact location and the exact speed of subatomic particles. If space comes in 2D bits with limited information about the precise location of objects, then space itself would fall under the same theory of uncertainty. The same way that matter continues to jiggle (as quantum waves) even when cooled to absolute zero, this digitized space should have built-in vibrations even in its lowest energy state.


Essentially, the experiment probes the limits of the universe's ability to store information. If there are a set number of bits that tell you where something is, it eventually becomes impossible to find more specific information about the location – even in principle. The instrument testing these limits is Fermilab's Holometer, or holographic interferometer, the most sensitive device ever created to measure the quantum jitter of space itself.


Now operating at full power, the Holometer uses a pair of interferometers placed close to one another. Each one sends a one-kilowatt laser beam (the equivalent of 200,000 laser pointers) at a beam splitter and down two perpendicular 40-meter arms. The light is then reflected back to the beam splitter where the two beams recombine, creating fluctuations in brightness if there is motion. Researchers analyze these fluctuations in the returning light to see if the beam splitter is moving in a certain way – being carried along on a jitter of space itself.


"Holographic noise" is expected to be present at all frequencies, but the scientists' challenge is not to be fooled by other sources of vibrations. The Holometer is testing a frequency so high – millions of cycles per second – that motions of normal matter are not likely to cause problems. Rather, the dominant background noise is more often due to radio waves emitted by nearby electronics. The Holometer experiment is designed to identify and eliminate noise from such conventional sources.


"If we find a noise we can't get rid of, we might be detecting something fundamental about nature–a noise that is intrinsic to spacetime," said Fermilab physicist Aaron Chou, lead scientist and project manager for the Holometer. "It's an exciting moment for physics. A positive result will open a whole new avenue of questioning about how space works."

Scooped by Dr. Stefan Gruenwald

Fascinating rhythm: Light pulses illuminate a rare black hole


The universe has so many black holes that it's impossible to count them all. There may be 100 million of these intriguing astral objects in our galaxy alone. Nearly all black holes fall into one of two classes: big, and colossal. Astronomers know that black holes ranging from about 10 times to 100 times the mass of our sun are the remnants of dying stars, and that supermassive black holes, more than a million times the mass of the sun, inhabit the centers of most galaxies.


But scattered across the universe like oases in a desert are a few apparent black holes of a more mysterious type. Ranging from a hundred times to a few hundred thousand times the sun's mass, these intermediate-mass black holes are so hard to measure that even their existence is sometimes disputed. Little is known about how they form. And some astronomers question whether they behave like other black holes.


Now a team of astronomers has accurately measured—and thus confirmed the existence of—a black hole about 400 times the mass of our sun in a galaxy 12 million light years from Earth. The finding, by University of Maryland astronomy graduate student Dheeraj Pasham and two colleagues, was published online August 17 in the journal Nature.


Pasham focused on one object in Messier 82, a galaxy in the constellation Ursa Major. Messier 82 is our closest "starburst galaxy," where young stars are forming. Beginning in 1999 a NASA satellite telescope, the Chandra X-ray Observatory, detected X-rays in Messier 82 from a bright object prosaically dubbed M82 X-1. Astronomers, including University of Maryland professor Richard Mushotzky and co-author Tod Strohmayer of NASA's Goddard Space Flight Center, suspected for about a decade that the object was an intermediate-mass black hole, but estimates of its mass were not definitive enough to confirm that.


Between 2004 and 2010 NASA's Rossi X-Ray Timing Explorer (RXTE) satellite telescope observed M82 X-1 about 800 times, recording individual X-ray photons emitted by the object. Pasham mapped the intensity and wavelength of the X-rays in each sequence, then stitched the sequences together and analyzed the result.

Among the material circling the suspected black hole, he spotted two repeating flares of light. The flares showed a rhythmic pattern of light pulses, one occurring 5.1 times per second and the other 3.3 times per second – or a ratio of 3:2.
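
The ~400-solar-mass estimate rests on the empirical result that such 3:2 QPO frequency pairs scale inversely with black-hole mass. The sketch below uses assumed reference values for a well-studied stellar-mass black hole (roughly those of GRS 1915+105); those reference numbers are not from the article:

```python
# Sketch of the inverse frequency-mass scaling for 3:2 QPO pairs: f ~ 1/M.
f_m82_high = 5.1    # Hz, faster pulse in M82 X-1 (from the article)
f_m82_low = 3.3     # Hz, slower pulse (from the article)
print(f"QPO ratio: {f_m82_high / f_m82_low:.2f} (close to 1.5, i.e. 3:2)")

# Assumed reference values for a stellar-mass black hole (not from the article)
f_ref = 168.0       # Hz, upper QPO of the reference source
m_ref = 12.0        # solar masses of the reference source

# f ~ 1/M  =>  M ~ M_ref * (f_ref / f)
m_estimate = m_ref * f_ref / f_m82_high
print(f"Implied mass: ~{m_estimate:.0f} solar masses")   # of order a few hundred
```

With these assumed reference numbers the scaling lands near the ~400 solar masses reported in the study.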

Scooped by Dr. Stefan Gruenwald

A possible signal from dark matter?


Galaxies are often found in groups or clusters, the largest known aggregations of matter and dark matter. The Milky Way, for example, is a member of the "Local Group" of about three dozen galaxies, including the Andromeda Galaxy located about 2 million light-years away. Very large clusters can contain thousands of galaxies, all bound together by gravity. The closest large cluster of galaxies to us, the Virgo Cluster with about 2000 members, is about 50 million light-years away. The Perseus Cluster is one of the most massive objects in the Universe with thousands of galaxies immersed in an enormous cloud of superheated gas.


The space between galaxies is not empty. It is filled with hot intergalactic gas whose temperature is of order ten million kelvin, or even higher. The gas is enriched with heavy elements that escape from the galaxies and accumulate in the intracluster medium over billions of years of galactic and stellar evolution. These elements can be detected from their X-ray emission lines, and include oxygen, neon, magnesium, silicon, sulfur, argon, calcium, iron, nickel, and even chromium and manganese.


The relative abundances of these elements contain valuable information on the rate of supernovae in the different types of galaxies in the clusters, since supernovae make and/or disperse them into the gas. It therefore came as something of a surprise when CfA astronomers and their colleagues discovered a faint line corresponding to no known element. Esra Bulbul, Adam Foster, Randall Smith, Scott Randall and their team were studying the averaged X-ray spectrum of a set of seventy-three clusters (including Virgo), looking for emission lines too faint to be seen in any single one, when they uncovered a line with no known match in a spectral interval not expected to have any features.


The scientists propose a tantalizing suggestion: the line is the result of the decay of a putative, long-sought-after dark matter particle, the so-called sterile neutrino. It had been suggested that the hot X-ray emitting gas in a galaxy cluster might be a good place to look for dark matter signatures, and if the sterile neutrino result is confirmed it would mark a breakthrough in dark matter research (it is of course possible that it is a statistical or other error). Recent unpublished results from another group tend to support the detection of this feature; the team suggests that observations with the planned Japanese Astro-H X-ray mission in 2015 will be critical to confirm and resolve the nature of this line.


More information: "Detection of an Unidentified Emission Line in the Stacked X-Ray Spectrum of Galaxy Clusters," Esra Bulbul, Maxim Markevitch, Adam Foster, Randall K. Smith, Michael Loewenstein, and Scott W. Randall, ApJ 789, 13, 2014.

Scooped by Dr. Stefan Gruenwald

New correction to speed of light could explain SN 1987A dual-pulse neutrino burst


The effect of gravity on virtual electron–positron pairs as they propagate through space could lead to a violation of Einstein's equivalence principle, according to calculations by James Franson at the University of Maryland, Baltimore County. While the effect would be too tiny to be measured directly using current experimental techniques, it could explain a puzzling anomaly observed during the famous supernova SN 1987A in 1987.


In modern theoretical physics, three of the four fundamental forces – electromagnetism, the weak nuclear force and the strong nuclear force – are described by quantum mechanics. The fourth force, gravity, does not currently have a quantum formulation and is best described by Einstein's general theory of relativity. Reconciling relativity with quantum mechanics is therefore an important and active area of physics.


An open question for theoretical physicists is how gravity acts on a quantum object such as a photon. Astronomical observations have shown repeatedly that light is attracted by a gravitational field. Traditionally, this is described using general relativity: the gravitational field bends space–time, and the light is slowed down (and slightly deflected) as it passes through the curved region. In quantum electrodynamics, a photon propagating through space can occasionally annihilate with itself, creating a virtual electron–positron pair. Soon after, the electron and positron recombine to recreate the photon. If they are in a gravitational potential then, for the short time they exist as massive particles, they feel the effect of gravity. When they recombine, they will create a photon with an energy that is shifted slightly and that travels slightly slower than if there was no gravitational potential.


Franson scrutinized these two explanations for why light slows down as it passes through a gravitational potential. He decided to calculate how much the light should slow down according to each theory, anticipating that he would get the same answer. However, he was in for a surprise: the predicted changes in the speed of light do not match, and the discrepancy has some very strange consequences.


Franson calculated that, treating light as a quantum object, the change in a photon's velocity depends not on the strength of the gravitational field, but on the gravitational potential itself. However, this leads to a violation of Einstein's equivalence principle – that gravity and acceleration are indistinguishable – because, in a gravitational field, the gravitational potential is created along with mass, whereas in a frame of reference accelerating in free fall, it is not. Therefore, one could distinguish gravity from acceleration by whether a photon slows down or not when it undergoes particle–antiparticle creation.


An important example is a photon and a neutrino propagating in parallel through space. A neutrino cannot annihilate to create an electron–positron pair, so the photon will slow down more than the neutrino as they pass through a gravitational field, potentially letting the neutrino travel faster than light through that region of space. However, if the problem is viewed in a frame of reference falling freely into the gravitational field, neither the photon nor the neutrino slows down at all, so the photon continues to travel faster than the neutrino.


While the idea that the laws of physics can be dependent on one's frame of reference seems nonsensical, it could explain an anomaly in the 1987 observation of supernova SN 1987A. An initial pulse of neutrinos was detected 7.7 hours before the first light from SN 1987A reached Earth. This was followed by a second pulse of neutrinos, which arrived about three hours before the supernova light. Supernovae are expected to emit large numbers of neutrinos, and the three-hour gap between the second burst of neutrinos and the arrival of the light agrees with the current theory of how a star collapses to create a supernova.


The first pulse of neutrinos is generally thought to be unrelated to the supernova. However, such a coincidence is statistically unlikely. If Franson's results are correct, then the 7.7-hour gap between the first pulse of neutrinos and the arrival of the light could be explained by the gravitational potential of the Milky Way slowing down the light. This does not explain why two neutrino pulses preceded the light, but Franson suggests the second pulse could be related to a two-step collapse of the star.
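
To see how small the implied change in the speed of light is, compare the 7.7-hour offset with the total light-travel time; the ~168,000 light-year distance to SN 1987A used below is a standard value assumed here, not one quoted in the article:

```python
# Fractional slowdown of light implied by a 7.7-hour head start for the neutrinos.
HOURS_PER_YEAR = 365.25 * 24

delay_hours = 7.7              # hours (from the article)
distance_ly = 168_000          # light-years to SN 1987A (assumed standard value)

travel_time_hours = distance_ly * HOURS_PER_YEAR
print(f"Light-travel time: {travel_time_hours:.2e} hours")
print(f"Fractional slowdown needed: {delay_hours / travel_time_hours:.1e}")  # a few parts per billion
```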


The research is published in the New Journal of Physics.

Sarah Clarke's curator insight, August 2, 10:51 AM

I love this... "NEW" correction to the speed of light. Since when did we start messing about with it?

Rescooped by Dr. Stefan Gruenwald from Fragments of Science

Quantum bounce could make black holes explode

If space-time is granular, it could reverse gravitational collapse and turn it into expansion.


Black holes might end their lives by transforming into their exact opposite — 'white holes' that explosively pour all the material they ever swallowed into space, say two physicists. The suggestion, based on a speculative quantum theory of gravity, could solve a long-standing conundrum about whether black holes destroy information.


The theory suggests that the transition from black hole to white hole would take place right after the initial formation of the black hole, but because gravity dilates time, outside observers would see the black hole lasting billions or trillions of years or more, depending on its size. If the authors are correct, tiny black holes that formed during the very early history of the Universe would now be ready to pop off like firecrackers and might be detected as high-energy cosmic rays or other radiation. In fact, they say, their work could imply that some of the dramatic flares commonly considered to be supernova explosions could in fact be the dying throes of tiny black holes that formed shortly after the Big Bang.


Albert Einstein’s general theory of relativity predicts that when a dying star collapses under its own weight, it can reach a stage at which the collapse is irreversible and no known force of nature can stop it. This is the formation of a black hole: a spherical surface, known as the event horizon, appears, shrouding the star inside from outside observers while it continues to collapse, because nothing — not even light or any other sort of information — can escape the event horizon.


Because dense matter curves space, ‘classical’ general relativity predicts that the star inside will continue to shrink into what is known as a singularity, a region where matter is infinitely dense and space is infinitely curved. In such situations, the known laws of physics cease to be useful.


Many physicists, however, believe that at some stage in this process, quantum-gravity effects should take over, arresting the collapse and avoiding the infinities.



Via Mariaschnee
Scooped by Dr. Stefan Gruenwald

'Superfluid spacetime' points to unification of physics

Thinking of space and time as a liquid might help reconcile quantum mechanics and relativity.


If spacetime is like a liquid — a concept some physicists say could help resolve a confounding disagreement between two dominant theories in physics — it must be a very special liquid indeed. A recent study compared astrophysical observations with predictions based on the notion of fluid spacetime, and found the idea only works if spacetime is incredibly smooth and freely flowing — in other words, a superfluid.


Thinking of spacetime as a liquid may be a helpful analogy. We often picture space and time as fundamental backdrops to the universe. But what if they are not fundamental, and built instead of smaller ingredients that exist on a deeper layer of reality that we cannot sense? If that were the case, spacetime’s properties would “emerge” from the underlying physics of its constituents, just as water’s properties emerge from the particles that comprise it. “Water is made of discrete, individual molecules, which interact with each other according to the laws of quantum mechanics, but liquid water appears continuous and flowing and transparent and refracting,” explains Ted Jacobson, a physicist at the University of Maryland, College Park. “These are all ‘emergent’ properties that cannot be found in the individual molecules, even though they ultimately derive from the properties of those molecules.”


Physicists have been considering this possibility since the 1990s in an attempt to reconcile the dominant theory of gravity on a large scale — general relativity — with the theory governing the very smallest bits of the universe—quantum mechanics. Both theories appear to work perfectly within their respective domains, but conflict with one another in situations that combine the large and small, such as black holes (extremely large mass, extremely small volume). Many physicists have tried to solve the problem by 'quantizing' gravity — dividing it into smaller bits, just as quantum mechanics breaks down many quantities, such as particles’ energy levels, into discrete packets. “There are many attempts to quantize gravity—string theory and loop quantum gravity are alternative approaches that can both claim to have gone a good leg forward,” says Stefano Liberati, a physicist at the International School for Advanced Studies (SISSA) in Trieste, Italy. “But maybe you don’t need to quantize gravity; you need to quantize this fundamental object that makes spacetime.”


Liberati, along with his colleague Luca Maccione of Ludwig Maximilian University in Munich, recently explored how that idea would affect light traveling through the universe. An emergent spacetime, one that acted like a fluid, would not be immediately distinguishable from the spacetime of any other theory. But in extreme situations, such as for very energetic light particles, Liberati and Maccione found that some differences would be noticeable. In fact, by examining observations of high-energy photons flying across the universe from the Crab Nebula, the physicists were able to rule out certain versions of emergent spacetime, finding that if it is a fluid at all, it must be a superfluid. The researchers published their results in Physical Review Letters in April.


In this analogy particles would travel through spacetime like waves in an ocean, and the laws of fluid mechanics — condensed-matter physics — would apply. Previously physicists considered how particles of different energies would disperse in spacetime, just as waves of different wavelengths disperse, or travel at different speeds, in water. In the latest study Liberati and Maccione took into account another fluid effect: dissipation. As waves travel through a medium, they lose energy over time. This dampening effect would also happen to photons traveling through spacetime, the researchers found. Although the effect is small, high-energy photons traveling very long distances should lose a noticeable amount of energy, the researchers say.


Scooped by Dr. Stefan Gruenwald

Space-based experiment could test gravity's effects on quantum entanglement


Physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena. In a new proposed experiment in this area, two toaster-sized "nanosatellites" carrying entangled condensates orbit around the Earth, until one of them moves to a different orbit with a different gravitational field strength. As a result of the change in gravity, the entanglement between the condensates is predicted to degrade by up to 20%. Experimentally testing the proposal may be possible in the near future.


"Our work shows that it is possible to test gravitational effects, which are thought to affect classical systems at large and very large scales, with genuinely (small) entangled quantum systems," Bruschi told Phys.org. "Our results aid the understanding of the effects of relativity on entanglement, an important resource for quantum information processing. Since we lack a theory that merges quantum theory and relativity, our work can help direct future theoretical and experimental efforts that investigate quantum effects at large scales."


Besides being of fundamental interest, understanding how gravity and other relativistic features affect quantum entanglement will help physicists develop quantum technologies for space-based applications. In a sense, space-based quantum technologies will take classical space-based technologies such as GPS into the quantum regime. It's well-known that GPS satellites require relativistic corrections to accurately determine time and position, and the same will hold true for quantum technologies.


While GPS is widespread, however, quantum technologies have not yet been developed for the space environment, although several ideas have been proposed. While most of these proposals fall under the framework of the theory of quantum mechanics, the new proposal differs in that it is developed within the framework of quantum field theory. This theory merges quantum theory and relativity in the sense that matter and radiation are quantized, while spacetime is treated as a classical background. The physicists here argue that quantum field theory provides a better model for understanding the effects of gravity on quantum properties.

Scooped by Dr. Stefan Gruenwald

Distant Black Hole Spins at Half the Speed of Light

Back when the universe was half its present age, supermassive black holes were feeding from a steady and plentiful diet of neighboring galaxies, the first measurement of a distant supermassive black hole’s spin shows.


Taking advantage of a naturally occurring zoom lens in space, astronomers analyzed X-rays streaming from near the mouth of a supermassive black hole powering a quasar about 6 billion light years from Earth.


“The ‘lens’ galaxy acts like a natural telescope, magnifying the light from the faraway quasar,” University of Michigan astronomer Rubens Reis explains in a paper published in this week’s Nature.


Analyzing four magnified images created by the lens galaxy -- an elliptical galaxy about 3 billion light years away -- Reis and colleagues found that the quasar’s black hole is spinning at half the speed of light.


The spin rate directly relates to how black holes feed and grow: The steadier the diet, the faster the spin, computer models show. “If the mass accretion was more messy it would suggest that the black hole would have a lower spin,” astronomer Mark Reynolds, also with University of Michigan, told Discovery News.


“What we found in this system is that it’s spinning very rapidly,” Reynolds said, consuming mass equivalent to about one sun per year. Spin rates may evolve over time, reflecting changes in evolution of galaxies.


At even greater distances, corresponding to earlier times, black hole spins might be even higher, approaching light speed, before slowing to the rate measured for RX J1131, the quasar in this study.


“If we go back further, maybe they’ll all be maximally spinning because of more mergers and more things happening. Or maybe they’ll be less spinning. We can theoretically produce both scenarios at the moment,” Reynolds said.


Scooped by Dr. Stefan Gruenwald

BOSS uses quasars to track the expanding universe—most precise measurement yet


The Baryon Oscillation Spectroscopic Survey (BOSS), the largest component of the third Sloan Digital Sky Survey (SDSS-III), pioneered the use of quasars to map density variations in intergalactic gas at high redshifts, tracing the structure of the young universe. BOSS charts the history of the universe's expansion in order to illuminate the nature of dark energy, and new measures of large-scale structure have yielded the most precise measurement of expansion since galaxies first formed.

The latest quasar results combine two separate analytical techniques. A new kind of analysis, led by physicist Andreu Font-Ribera of the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and his team, was published late last year. Analysis using a tested approach, but with far more data than before, has just been published by Timothée Delubac, of EPFL Switzerland and France's Centre de Saclay, and his team. The two analyses together establish the expansion rate at 68 kilometers per second per million light years at redshift 2.34, with an unprecedented accuracy of 2.2 percent.


"This means if we look back to the universe when it was less than a quarter of its present age, we'd see that a pair of galaxies separated by a million light years would be drifting apart at a velocity of 68 kilometers a second as the universe expands," says Font-Ribera, a postdoctoral fellow in Berkeley Lab's Physics Division. "The uncertainty is plus or minus only a kilometer and a half per second." Font-Ribera presented the findings at the April 2014 meeting of the American Physical Society in Savannah, GA.


BOSS employs both galaxies and distant quasars to measure baryon acoustic oscillations (BAO), a signature imprint in the way matter is distributed, resulting from conditions in the early universe. While also present in the distribution of invisible dark matter, the imprint is evident in the distribution of ordinary matter, including galaxies, quasars, and intergalactic hydrogen.


"Three years ago BOSS used 14,000 quasars to demonstrate we could make the biggest 3-D maps of the universe," says Berkeley Lab's David Schlegel, principal investigator of BOSS. "Two years ago, with 48,000 quasars, we first detected baryon acoustic oscillations in these maps. Now, with more than 150,000 quasars, we've made extremely precise measures of BAO."

Scooped by Dr. Stefan Gruenwald

Gas from another galaxy is hitting our own, triggering the birth of bright new stars in the Milky Way


For the first time astronomers have detected stars in an enormous stream of gas shed by the Magellanic Clouds, the two brightest galaxies that orbit our own.


Sought for decades, the newfound stars are young, which means they formed recently, while the Magellanic gas collided with gas in the Milky Way. The newborn stars offer insight into processes that occurred in the ancient universe, when small, gas-rich galaxies smashed together to give rise to giants like the Milky Way. "This is the one and only galaxy interaction we can model in very much detail," says Dana Casetti-Dinescu, an astronomer at Southern Connecticut State University, who notes that other collisions of gas clouds between galaxies are farther away and thus harder to observe. "For more distant systems that interact, we don't have the wealth of information."


Some two dozen galaxies revolve around our own but only the Magellanic Clouds shine so brightly that stargazers can see the pair with the naked eye. What really sets these two apart is their vigor: Unlike all other Milky Way satellites, the Magellanic Clouds abound with gas, the raw material galaxies use to create new stars.


The Magellanic Clouds are certainly nearby: The Large Magellanic Cloud is just 160,000 light-years from Earth, whereas the Small Magellanic Cloud is 200,000 light-years distant and 75,000 light-years away from its partner. As the two galaxies orbit the Milky Way, they probably orbit each other, too.


A closer look at the Magellanic Clouds reveals more details. In the early 1970s radio astronomers discovered a long stream of gas that trails behind the two galaxies in their orbit around us. This gas, named the Magellanic Stream, consists mostly of neutral hydrogen atoms, which broadcast radio waves that are 21 centimeters long. A shorter gaseous component leads the Magellanic Clouds and is therefore called the Leading Arm. From the tip of the Leading Arm to the far end of the Magellanic Stream, this gaseous strand is at least 200 degrees long and stretches across more than half a million light-years of space.


Just as the moon lifts the terrestrial seas, the Large Magellanic Cloud's gravitational pull has torn most of this gas out of the Small Magellanic Cloud, whose grasp on its contents is less secure. Stars should also have spilled out of the Magellanic Clouds. But although both stars and gas exist between the Magellanic Clouds, no one has ever found any stars in either the Magellanic Stream or the Leading Arm.


Until now. Casetti-Dinescu and her colleagues used the 6.5-meter Walter Baade telescope at Las Campanas Observatory in Chile to uncover six luminous blue stars in the Leading Arm. "They are formed in situ," she says. "They have to be, because they're too young—they don't have enough time to travel from the Clouds to their current location in their lifetime." Five of the six stars are about 60,000 light-years from the Milky Way's center, near the periphery of our galaxy's disk of stars.

Scooped by Dr. Stefan Gruenwald

Spectacular cosmic discovery hailed: Gravitational waves put twist pattern on CMB


Scientists say they have extraordinary new evidence to support a Big Bang Theory for the origin of the Universe. Researchers believe they have found the signal left in the sky by the super-rapid expansion of space that must have occurred just fractions of a second after everything came into being.


It takes the form of a distinctive twist in the oldest light detectable with telescopes. The work will be scrutinised carefully, but already there is talk of a Nobel. "This is spectacular," commented Prof Marc Kamionkowski, from Johns Hopkins University.


"I've seen the research; the arguments are persuasive, and the scientists involved are among the most careful and conservative people I know," he said.


The breakthrough was announced by an American team working on a project known as BICEP2. This project has been using a telescope at the South Pole to make detailed observations of a small patch of sky.


The aim has been to try to find a residual marker for "inflation" - the idea that the cosmos experienced an exponential growth spurt in its first trillionth of a trillionth of a trillionth of a second.


Theory holds that this would have taken the infant Universe from something unimaginably small to something about the size of a marble. Space has continued to expand for the nearly 14 billion years since.


Inflation was first proposed in the early 1980s to explain some aspects of Big Bang Theory that appeared to not quite add up, such as why deep space looks broadly the same on all sides of the sky. The contention was that a very rapid expansion early on could have smoothed out any unevenness.


But inflation came with a very specific prediction - that it would be associated with waves of gravitational energy, and that these ripples in the fabric of space would leave an indelible mark on the oldest light in the sky - the famous Cosmic Microwave Background.


The BICEP2 team says it has now identified that signal. Scientists call it B-mode polarisation. It is a characteristic twist in the directional properties of the CMB. Only the gravitational waves moving through the Universe in its inflationary phase could have produced such a marker. It is a true "smoking gun".


"Detecting this signal is one of the most important goals in cosmology today. A lot of work by a lot of people has led up to this point," said Prof John Kovac of the Harvard-Smithsonian Center for Astrophysics and a leader of the BICEP2 collaboration.


The signal is reported to be quite a bit stronger than many scientists had dared hope. This simplifies matters, say experts. It means the more exotic models for how inflation worked are no longer tenable.


The results also constrain the energies involved - at 10,000 trillion gigaelectronvolts. This is consistent with ideas for what is termed Grand Unified Theory, the realm where particle physicists believe three of the four fundamental forces in nature can be tied together.
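
Written in scientific notation, "10,000 trillion gigaelectronvolts" is 10^16 GeV, the usual grand-unification scale; the Planck-energy comparison below uses a standard reference value not quoted in the article:

```python
# Express the quoted inflation energy scale and compare it with the Planck energy.
gut_scale_gev = 10_000e12        # "10,000 trillion gigaelectronvolts" -> 1e16 GeV
planck_energy_gev = 1.22e19      # Planck energy in GeV (assumed standard value)

print(f"Inflation energy scale: {gut_scale_gev:.0e} GeV")
print(f"Fraction of the Planck energy: {gut_scale_gev / planck_energy_gev:.1e}")
```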

Scooped by Dr. Stefan Gruenwald

Distant quasar lights up cosmic web like a neon sign


That the Universe is largely composed of a cosmic web consisting of narrow filaments upon which galaxies and intergalactic gas and dust are concentrated has been known for more than a decade. While a great deal of evidence for this has accumulated, visual evidence has been difficult to find. Astronomers have now photographed what appears to be a segment of a cosmic filament stimulated into fluorescence by irradiation from a nearby quasar.


The filaments of the cosmic web are difficult to see visually. They consist primarily of dark matter and intergalactic gas and dust, none of which have a visible signature detectable across billions of light years. As a result, our knowledge of filaments primarily comes from gravitational lensing studies, radio observations, and x-ray telescopes.


Now a team, led by researchers at the University of California, Santa Cruz (UCSC), has found an unusual configuration of celestial objects that appears to make visible a part of a filament that is ten billion light years distant. The section of the filament that is visible takes the form of a huge asymmetric nebula of diffuse intergalactic gas.


Normally this gas would not emit significant amounts of light, but in this case the intergalactic gas is being irradiated by extreme UV light from a nearby quasar, the active center of a galaxy. This irradiation ionizes the gas (mostly atomic hydrogen), which then emits the characteristic light of atomic hydrogen (Lyman-alpha radiation) when the ionized atoms regain their electrons. When the redshift (z ~ 2.27) is taken into account, the Lyman-alpha radiation appears to our instruments as a violet glow.
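
The violet color follows from the standard redshift relation, observed wavelength = rest wavelength × (1 + z); the 121.6 nm rest wavelength of Lyman-alpha is a standard value assumed here, not one given in the article:

```python
# Redshifted Lyman-alpha emission from the filament.
rest_lyman_alpha_nm = 121.6   # nm, rest-frame Lyman-alpha (assumed standard value)
z = 2.27                      # redshift quoted in the article

observed_nm = rest_lyman_alpha_nm * (1 + z)
print(f"Observed wavelength: {observed_nm:.0f} nm")  # ~398 nm, at the violet end of the visible range
```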


Another line of evidence comes from the SDSS, which used a 2.5-meter telescope to image and determine redshifts (and thereby distances) for galaxies in the cosmic vicinity of the Milky Way. The resulting map includes galaxies and quasars located in a thin slice of the sky above the Earth's equator, out to a distance of two billion light years. One's first impression is of a slice through a foam of luminous bodies that lie on the boundary of huge voids.


Rather solid evidence also exists for filaments with a goodly share of dark matter. One such dark matter filament stretches about sixty million light years between the galaxy clusters Abell 222 and 223. X-ray emissions from the filament suggest that nearly 10 percent of its mass consists of hot gas, with the filament as a whole containing at least dark matter and intergalactic gas.


The team published a report in the January 19, 2014, issue of Nature describing their discovery of a rather unusual configuration of celestial objects in the early history of the Universe (about three billion years after the Big Bang) that provides additional evidence for the existence of the cosmic web.


Greg Wurn's curator insight, March 4, 6:35 PM

The time and distances discussed in this article amaze me: "billions of light years" means light traveling at 186,000 miles a second for billions of years. The distances involved in space travel should be enough to convince anyone that our best hope for the future, by far, is to look after this amazing planet!

Scooped by Dr. Stefan Gruenwald

Hubble Helps to Find Smallest Known Galaxy Containing a Supermassive Black Hole


Astronomers using data from NASA’s Hubble Space Telescope and ground observation have found an unlikely object in an improbable place -- a monster black hole lurking inside one of the tiniest galaxies ever known.


The black hole is five times the mass of the one at the center of our Milky Way galaxy. It is inside one of the densest galaxies known to date -- the M60-UCD1 dwarf galaxy that crams 140 million stars within a diameter of about 300 light-years, which is only 1/500th of our galaxy’s diameter.

If you lived inside this dwarf galaxy, the night sky would dazzle with at least 1 million stars visible to the naked eye. Our nighttime sky as seen from Earth's surface shows about 4,000 naked-eye stars.


The finding implies there are many other compact galaxies in the universe that contain supermassive black holes. The observation also suggests dwarf galaxies may actually be the stripped remnants of larger galaxies that were torn apart during collisions with other galaxies rather than small islands of stars born in isolation.


“We don’t know of any other way you could make a black hole so big in an object this small,” said University of Utah astronomer Anil Seth, lead author of an international study of the dwarf galaxy published in Thursday’s issue of the journal Nature.


Seth’s team of astronomers used the Hubble Space Telescope and the Gemini North 8-meter optical and infrared telescope on Hawaii’s Mauna Kea to observe M60-UCD1 and measure the black hole’s mass. The sharp Hubble images provide information about the galaxy’s diameter and stellar density. Gemini measures the stellar motions as affected by the black hole’s pull. These data are used to calculate the mass of the black hole.


Black holes are gravitationally collapsed, ultra-compact objects that have a gravitational pull so strong that even light cannot escape. Supermassive black holes -- those with the mass of at least one million stars like our sun -- are thought to be at the centers of many galaxies.


The black hole at the center of our Milky Way galaxy has the mass of four million suns. As heavy as that is, it is less than 0.01 percent of the Milky Way’s total mass. By comparison, the supermassive black hole at the center of M60-UCD1, which has the mass of 21 million suns, is a stunning 15 percent of the small galaxy’s total mass.


“That is pretty amazing, given that the Milky Way is 500 times larger and more than 1,000 times heavier than the dwarf galaxy M60-UCD1,” Seth said.
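
The percentages quoted above can be cross-checked against the masses given earlier in the piece (treating "140 million stars" as roughly 140 million solar masses, an assumption made only for this check):

```python
# Cross-check the mass bookkeeping quoted in the article.
M_BH_UCD1 = 21e6        # solar masses, M60-UCD1 black hole (from the article)
M_UCD1 = 140e6          # solar masses, taking "140 million stars" ~ 140e6 Msun (assumption)

M_BH_MW = 4e6           # solar masses, Milky Way black hole (from the article)
M_MW = 1000 * M_UCD1    # Milky Way "more than 1,000 times heavier" (from the article)

print(f"M60-UCD1 black hole fraction: {M_BH_UCD1 / M_UCD1:.0%}")   # 15%, as quoted
print(f"Milky Way black hole fraction: {M_BH_MW / M_MW:.4%}")      # well under 0.01%, as quoted
```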


One explanation is that M60-UCD1 was once a large galaxy containing 10 billion stars, but then it passed very close to the center of an even larger galaxy, M60, and in that process all the stars and dark matter in the outer part of the galaxy were torn away and became part of M60.


The team believes that M60-UCD1 may eventually be pulled to fully merge with M60, which has its own monster black hole that weighs a whopping 4.5 billion solar masses, or more than 1,000 times bigger than the black hole in our galaxy. When that happens, the black holes in both galaxies also likely will merge. Both galaxies are 50 million light-years away.

Scooped by Dr. Stefan Gruenwald

WIRED: Radical New Theory Could Kill the Multiverse Hypothesis and Gets Rid of Concepts Like "Length" and "Mass"

Mass and length may not be fundamental properties of nature, according to new ideas bubbling out of the multiverse.


Though galaxies look larger than atoms and elephants appear to outweigh ants, some physicists have begun to suspect that size differences are illusory. Perhaps the fundamental description of the universe does not include the concepts of “mass” and “length,” implying that at its core, nature lacks a sense of scale.


This little-explored idea, known as scale symmetry, constitutes a radical departure from long-standing assumptions about how elementary particles acquire their properties. But it has recently emerged as a common theme of numerous talks and papers by respected particle physicists. With their field stuck at a nasty impasse, the researchers have returned to the master equations that describe the known particles and their interactions, and are asking: What happens when you erase the terms in the equations having to do with mass and length?


Nature, at the deepest level, may not differentiate between scales. With scale symmetry, physicists start with a basic equation that sets forth a massless collection of particles, each a unique confluence of characteristics such as whether it is matter or antimatter and has positive or negative electric charge. As these particles attract and repel one another and the effects of their interactions cascade like dominoes through the calculations, scale symmetry “breaks,” and masses and lengths spontaneously arise.


Similar dynamical effects generate 99 percent of the mass in the visible universe. Protons and neutrons are amalgams — each one a trio of lightweight elementary particles called quarks. The energy used to hold these quarks together gives them a combined mass that is around 100 times more than the sum of the parts. “Most of the mass that we see is generated in this way, so we are interested in seeing if it’s possible to generate all mass in this way,” said Alberto Salvio, a particle physicist at the Autonomous University of Madrid and the co-author of a recent paper on a scale-symmetric theory of nature.
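
The "around 100 times" claim can be checked against approximate particle masses (the quark and proton masses below are standard approximate values, not taken from the article):

```python
# Compare the proton's mass with the bare masses of its three valence quarks.
# Approximate standard values in MeV/c^2 (assumed, not from the article).
m_up = 2.2
m_down = 4.7
m_proton = 938.3

quark_sum = 2 * m_up + m_down      # a proton is two up quarks plus one down quark
print(f"Sum of bare quark masses: ~{quark_sum:.1f} MeV")
print(f"Proton mass / quark sum:  ~{m_proton / quark_sum:.0f}x")   # roughly 100x
```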


In the equations of the “Standard Model” of particle physics, only a particle discovered in 2012, called the Higgs boson, comes equipped with mass from the get-go. According to a theory developed 50 years ago by the British physicist Peter Higgs and associates, it doles out mass to other elementary particles through its interactions with them. Electrons, W and Z bosons, individual quarks and so on: All their masses are believed to derive from the Higgs boson — and, in a feedback effect, they simultaneously dial the Higgs mass up or down, too.


The new scale symmetry approach rewrites the beginning of that story. “The idea is that maybe even the Higgs mass is not really there,” said Alessandro Strumia, a particle physicist at the University of Pisa in Italy. “It can be understood with some dynamics.”


The concept seems far-fetched, but it is garnering interest at a time of widespread soul-searching in the field. When the Large Hadron Collider at CERN Laboratory in Geneva closed down for upgrades in early 2013, its collisions had failed to yield any of dozens of particles that many theorists had included in their equations for more than 30 years. The grand flop suggests that researchers may have taken a wrong turn decades ago in their understanding of how to calculate the masses of particles.


“We’re not in a position where we can afford to be particularly arrogant about our understanding of what the laws of nature must look like,” said Michael Dine, a professor of physics at the University of California, Santa Cruz, who has been following the new work on scale symmetry. “Things that I might have been skeptical about before, I’m willing to entertain.”

Scooped by Dr. Stefan Gruenwald

Open access to the universe: Scientists generated a giant cosmic simulation and give it away for free


A small team of astrophysicists and computer scientists have created some of the highest-resolution snapshots yet of a cyber version of our own cosmos. Called the Dark Sky Simulations, they’re among a handful of recent simulations that use more than 1 trillion virtual particles as stand-ins for all the dark matter that scientists think our universe contains.


They’re also the first trillion-particle simulations to be made publicly available, not only to other astrophysicists and cosmologists to use for their own research, but to everyone. The Dark Sky Simulations can now be accessed through a visualization program in coLaboratory, a newly announced tool created by Google and Project Jupyter that allows multiple people to analyze data at the same time.


To make such a giant simulation, the collaboration needed time on a supercomputer. Despite fierce competition, the group won 80 million computing hours on Oak Ridge National Laboratory’s Titan through the Department of Energy’s 2014 INCITE program.


In mid-April, the group turned Titan loose. For more than 33 hours, they used two-thirds of one of the world’s largest and fastest supercomputers to direct a trillion virtual particles to follow the laws of gravity as translated to computer code, set in a universe that expanded the way cosmologists believe ours has for the past 13.7 billion years.


“This simulation ran continuously for almost two days, and then it was done,” says Michael Warren, a scientist in the Theoretical Astrophysics Group at Los Alamos National Laboratory. Warren has been working on the code underlying the simulations for two decades. “I haven’t worked that hard since I was a grad student.”


Back in his grad school days, Warren says, simulations with millions of particles were considered cutting-edge. But as computing power increased, particle counts did too. "They were doubling every 18 months. We essentially kept pace with Moore's Law."


When planning such a simulation, scientists make two primary choices: the volume of space to simulate and the number of particles to use. The more particles added to a given volume, the smaller the objects that can be simulated—but the more processing power needed to do it.


Current galaxy surveys such as the Dark Energy Survey are mapping out large volumes of space but also discovering small objects. The under-construction Large Synoptic Survey Telescope "will map half the sky and can detect a galaxy like our own up to 7 billion years in the past," says Risa Wechsler of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), who also worked on the simulation. "We wanted to create a simulation that a survey like LSST would be able to compare their observations against."


The time the group was awarded on Titan made it possible for them to run something of a Goldilocks simulation, says Sam Skillman, a postdoctoral researcher with the Kavli Institute for Particle Astrophysics and Cosmology, a joint institute of Stanford and SLAC National Accelerator Laboratory. “We could model a very large volume of the universe, but still have enough resolution to follow the growth of clusters of galaxies.”


The end result of the mid-April run was 500 trillion bytes of simulation data. Then it was time for the team to fulfill the second half of their proposal: They had to give it away.
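
The 500-trillion-byte figure works out to a modest per-particle budget. The layout sketched below (single-precision positions and velocities plus a 64-bit ID) is purely an illustrative assumption, not a description of the actual Dark Sky file format:

```python
# Rough data-volume bookkeeping for a trillion-particle run.
total_bytes = 500e12      # "500 trillion bytes" (from the article)
n_particles = 1e12        # one trillion particles (from the article)

per_particle = total_bytes / n_particles
print(f"Budget: {per_particle:.0f} bytes per particle in total")   # ~500 bytes

# Hypothetical snapshot layout: 3 float32 positions + 3 float32 velocities + 1 int64 ID
bytes_per_snapshot = 3 * 4 + 3 * 4 + 8   # 32 bytes per particle per snapshot
print(f"Under that assumed layout: ~{per_particle / bytes_per_snapshot:.0f} full snapshots")
```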

Scooped by Dr. Stefan Gruenwald

Was a 3D-black hole from the surface of a 4D-universe the creator of our universe?

Our universe may have emerged from a black hole in a higher-dimensional universe, propose a trio of Perimeter Institute researchers.


The big bang poses a big question: if it was indeed the cataclysm that blasted our universe into existence 13.7 billion years ago, what sparked it?


Three Perimeter Institute researchers have a new idea about what might have come before the big bang. It's a bit perplexing, but it is grounded in sound mathematics, testable, and enticing enough to earn the cover story in Scientific American, called "The Black Hole at the Beginning of Time." What we perceive as the big bang, they argue, could be the three-dimensional "mirage" of a collapsing star in a universe profoundly different than our own.


"Cosmology's greatest challenge is understanding the big bang itself," write Perimeter Institute Associate Faculty member Niayesh Afshordi, Affiliate Faculty member and University of Waterloo professor Robert Mann, and PhD student Razieh Pourhasan. Conventional understanding holds that the big bang began with a singularity – an unfathomably hot and dense phenomenon of spacetime where the standard laws of physics break down. Singularities are bizarre, and our understanding of them is limited. "For all physicists know, dragons could have come flying out of the singularity," Afshordi says in an interview with Nature.


In our three-dimensional universe, black holes have two-dimensional event horizons – that is, they are surrounded by a two-dimensional boundary that marks the "point of no return." In the case of a four-dimensional universe, a black hole would have a three-dimensional event horizon. In their proposed scenario, our universe was never inside the singularity; rather, it came into being outside an event horizon, protected from the singularity. It originated as – and remains – just one feature in the imploded wreck of a four-dimensional star.


The researchers emphasize that this idea, though it may sound "absurd," is grounded firmly in the best modern mathematics describing space and time. Specifically, they've used the tools of holography to "turn the big bang into a cosmic mirage." Along the way, their model appears to address long-standing cosmological puzzles and – crucially – produce testable predictions. Of course, our intuition tends to recoil at the idea that everything and everyone we know emerged from the event horizon of a single four-dimensional black hole. We have no concept of what a four-dimensional universe might look like. We don't know how a four-dimensional "parent" universe itself came to be.


But our fallible human intuitions, the researchers argue, evolved in a three-dimensional world that may only reveal shadows of reality. They draw a parallel to Plato's allegory of the cave, in which prisoners spend their lives seeing only the flickering shadows cast by a fire on a cavern wall.


"Their shackles have prevented them from perceiving the true world, a realm with one additional dimension," they write. "Plato's prisoners didn't understand the powers behind the sun, just as we don't understand the four-dimensional bulk universe. But at least they knew where to look for answers."

Vloasis's curator insight, August 11, 11:00 PM

Maybe an alien experiment from another dimension went horribly awry and created our universe. Like, they were trying to find a new way to bomb the shit out of each other, and instead created a new existence. Or perhaps it was all too successful and rent open their world to create ours.

Eric Chan Wei Chiang's curator insight, August 12, 12:30 AM

This is a fairly long scoop that would appeal to theologians from various faiths. It provides a scientific explanation for the "original mover" of Abrahamic faiths. The higher realms of existence would appeal to followers of Hinduism and Buddhism.

 

Other scoops related to cosmology can be read here:

http://www.scoop.it/t/world-of-tomorrow/?tag=Cosmology

Scooped by Dr. Stefan Gruenwald

Is The Universe A Multiverse Bubble? Physicists Are Trying To Bring It Into The Realm Of Testable Science


Never mind the big bang; in the beginning was the vacuum. The vacuum simmered with energy (variously called dark energy, vacuum energy, the inflation field, or the Higgs field). Like water in a pot, this high energy began to evaporate – bubbles formed.


Each bubble contained another vacuum, whose energy was lower, but still not nothing. This energy drove the bubbles to expand. Inevitably, some bubbles bumped into each other. It’s possible some produced secondary bubbles. Maybe the bubbles were rare and far apart; maybe they were packed close as foam.


Proponents of the multiverse theory argue that it’s the next logical step in the inflation story. Detractors argue that it is not physics, but metaphysics – that it is not science because it cannot be tested. After all, physics lives or dies by data that can be gathered and predictions that can be checked.


That’s where Perimeter Associate Faculty member Matthew Johnson comes in. Working with a small team that also includes Perimeter Faculty member Luis Lehner, Johnson is working to bring the multiverse hypothesis firmly into the realm of testable science.

“That’s what this research program is all about,” he says. “We’re trying to find out what the testable predictions of this picture would be, and then going out and looking for them.”


Specifically, Johnson has been considering the rare cases in which our bubble universe might collide with another bubble universe. He lays out the steps: “We simulate the whole universe. We start with a multiverse that has two bubbles in it, we collide the bubbles on a computer to figure out what happens, and then we stick a virtual observer in various places and ask what that observer would see from there.”


“Simulating the universe is easy,” says Johnson. Simulations, he explains, do not account for every atom, every star, or every galaxy – in fact, they account for none of them. “We’re simulating things only on the largest scales,” he says. “All I need is gravity and the stuff that makes these bubbles up. We’re now at the point where if you have a favourite model of the multiverse, I can stick it on a computer and tell you what you should see.”
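To make the procedure Johnson describes concrete, here is a minimal toy sketch in Python (emphatically not the team's actual pipeline): a 1+1-dimensional scalar field with a false vacuum and a true vacuum, two bubbles nucleated by hand, and a simple symplectic integration of the wave equation so the bubbles expand and collide. All grid sizes, potential parameters, and bubble positions are illustrative assumptions.

```python
import numpy as np

# Illustrative grid and time-stepping parameters (not from the paper)
N, L = 1024, 100.0
dx = L / N
dt = 0.4 * dx
steps = 1200
eps = 0.2                       # energy difference tilting the double-well potential

x = np.linspace(0.0, L, N, endpoint=False)

def dV(phi):
    # Tilted double well: false vacuum near phi = -1, true vacuum near phi = +1
    return phi * (phi**2 - 1.0) - eps / 2.0

def bubble(center, radius, width=1.0):
    # Smooth wall profile: roughly 1 inside the bubble, 0 outside
    return 0.5 * (np.tanh((radius - np.abs(x - center)) / width) + 1.0)

# Start in the false vacuum and nucleate two well-separated true-vacuum bubbles
phi = -1.0 + 2.0 * (bubble(35.0, 6.0) + bubble(65.0, 6.0))
pi = np.zeros_like(phi)         # d(phi)/dt

def laplacian(f):
    return (np.roll(f, 1) - 2.0 * f + np.roll(f, -1)) / dx**2

for _ in range(steps):
    # Kick-drift (symplectic Euler) update of phi_tt = phi_xx - dV/dphi,
    # with periodic boundaries
    pi += dt * (laplacian(phi) - dV(phi))
    phi += dt * pi

# By now the bubble walls have expanded and collided; a "virtual observer"
# would be placed at different x to see how the collision looks locally.
print("fraction of the box now in the true vacuum:", float((phi > 0).mean()))
```

The point of the toy is the same one Johnson makes: only the bubbles themselves and the field between them need to be resolved, nothing smaller.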


That’s a small step for a computer simulation program, but a giant leap for the field of multiverse cosmology. By producing testable predictions, the multiverse model has crossed the line between appealing story and real science.


In fact, Johnson says, the program has reached the point where it can rule out certain models of the multiverse: “We’re now able to say that some models predict something that we should be able to see, and since we don’t in fact see it, we can rule those models out.”


For instance, collisions of one bubble universe with another would leave what Johnson calls “a disk on the sky” – a circular bruise in the cosmic microwave background. That the search for such a disk has so far come up empty makes certain collision-filled models less likely.
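The "disk on the sky" test can also be illustrated with a small sketch. The Python snippet below is our own illustration, with made-up noise levels and disc sizes rather than the collaboration's analysis: it builds a mock temperature patch, injects a faint disc, and asks how significant the disc-averaged signal is against the background fluctuations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative numbers only: patch size, fluctuation level, and disc properties
n = 256                         # pixels per side of the mock sky patch
noise_rms = 100.0               # background fluctuation level (microkelvin)
disc_amp = 15.0                 # assumed collision-induced temperature offset
disc_radius = 30                # disc radius in pixels

# Mock patch: Gaussian fluctuations plus a faint disc near the centre
y, x = np.indices((n, n))
inside = (x - n // 2) ** 2 + (y - n // 2) ** 2 <= disc_radius ** 2
patch = rng.normal(0.0, noise_rms, (n, n)) + disc_amp * inside

# Disc-template statistic: mean temperature inside the template vs. its scatter
mean_inside = patch[inside].mean()
sigma = noise_rms / np.sqrt(inside.sum())
print(f"disc-averaged signal: {mean_inside:.1f} uK ({mean_inside / sigma:.1f} sigma)")
```

A real search has to contend with correlated CMB fluctuations and unknown disc positions and sizes, which is why the published non-detections are statistically careful statements rather than a single significance number.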


Meanwhile, the team is at work figuring out what other kinds of evidence a bubble collision might leave behind. It’s the first time, the team writes in their paper, that anyone has produced a direct quantitative set of predictions for the observable signatures of bubble collisions. And though none of those signatures has so far been found, some of them are possible to look for.


The real significance of this work is as a proof of principle: it shows that the multiverse can be testable. In other words, if we are living in a bubble universe, we might actually be able to tell.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

First Evidence Of A Correction To The Speed of Light

First Evidence Of A Correction To The Speed of Light | Amazing Science | Scoop.it

When astronomers first observed light from a supernova arriving 7.7 hours after the neutrinos from the same event, they ignored the evidence. Now one physicist says the speed of light must be slower than Einstein predicted and has developed a theory that explains why.


In the early hours of the morning on 24 February 1987, a neutrino detector deep beneath Mont Blanc in northern Italy picked up a sudden burst of neutrinos. Three hours later, neutrino detectors at two other locations picked up a similar burst. In all, the event consisted of two bursts of neutrinos separated by three hours, followed by the first optical signals 4.7 hours later.


Some 4.7 hours after this, astronomers studying the Large Magellanic Cloud that orbits our galaxy noticed the tell-tale brightening of a blue supergiant star called Sanduleak -69 202 as it became a supernova. Since then, SN 1987A, as it was designated, has become one of the most widely studied supernovas in history.


Neutrinos and photons both travel at (essentially) the speed of light and should therefore arrive simultaneously, all else being equal. The mystery is what caused this huge delay of 7.7 hours between the first burst of neutrinos and the arrival of the optical photons.


Today, we get an answer thanks to the work of James Franson at the University of Maryland in Baltimore. Franson has used the laws of quantum mechanics to calculate the speed of light travelling through a gravitational potential related to the mass of the Milky Way.


Because all previous speed-of-light calculations have relied only on general relativity, they do not take into account the tiny effects of quantum mechanics. But these effects are significant over such long distances and through such a large mass as the Milky Way, says Franson.


He says that quantum mechanical effects should slow down light in these kinds of circumstances and calculates that this more or less exactly accounts for the observed delay.
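As a rough sense of scale (a back-of-envelope check, not Franson's calculation): SN 1987A lies roughly 168,000 light-years away, so a 7.7-hour lag only requires the photons to have travelled a few parts in a billion slower than the neutrinos.

```python
# Back-of-envelope check (not Franson's derivation): the fractional slowdown of
# light needed to accumulate a 7.7-hour delay over the ~168,000-light-year
# journey from SN 1987A. The distance is the commonly quoted approximate value.
travel_time_hours = 168_000 * 365.25 * 24    # light-travel time, in hours
delay_hours = 7.7

fractional_slowdown = delay_hours / travel_time_hours
print(f"required (c - v_photon) / c = {fractional_slowdown:.1e}")   # about 5e-9
```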


First, some background about the mechanism behind the supernova. A supernova begins with the collapse of a star’s core, generating both neutrinos and optical photons. However, the density of the core delays the emergence of the photons by about 3 hours. By contrast, the neutrinos interact less strongly with matter and so emerge unscathed more or less immediately.

more...
Carlos Garcia Pando's comment, June 27, 1:51 AM
"They ignored the evidence" Interesting. Why? Because the observation did not fit into the theory. Theory was their religion and couldn't be denied by facts.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Powerful lasers create table-top supernova

Powerful lasers create table-top supernova | Amazing Science | Scoop.it
Laser beams 60,000 billion times more powerful than a laser pointer have been used to recreate scaled supernova explosions in the laboratory as a way of investigating one of the most energetic events in the Universe.


To recreate a supernova explosion in the laboratory, the team used the Vulcan laser facility at the UK's Science and Technology Facilities Council's Rutherford Appleton Lab. 'Our team began by focusing three laser beams onto a carbon rod target, not much thicker than a strand of hair, in a low-density gas-filled chamber,' said Ms Jena Meinecke, an Oxford University graduate student who headed the experimental efforts. The enormous amount of heat generated by the laser (more than a few million degrees Celsius) caused the rod to explode, creating a blast that expanded out through the low-density gas. In the experiments, the dense gas clumps or gas clouds that surround an exploding star were simulated by introducing a plastic grid to disturb the shock front.


'The experiment demonstrated that as the blast of the explosion passes through the grid it becomes irregular and turbulent, just like the images from Cassiopeia A,' said Professor Gregori. 'We found that the magnetic field is higher with the grid than without it. Since higher magnetic fields imply a more efficient generation of radio and X-ray photons, this result confirms that the idea that supernova explosions expand into uniformly distributed interstellar material isn't always correct, and it is consistent with both observations and numerical models of a shockwave passing through a "clumpy" medium.'


'Magnetic fields are ubiquitous in the universe,' said Don Lamb, the Robert A. Millikan Distinguished Service Professor in Astronomy & Astrophysics at the University of Chicago. 'We're pretty sure that the fields didn't exist at the beginning, at the Big Bang. So there’s this fundamental question: how did magnetic fields arise?' These results are significant because they help to piece together a story for the creation and development of magnetic fields in our Universe, and provide the first experimental proof that turbulence amplifies magnetic fields in the tenuous interstellar plasma.

The advance was made possible by the extraordinarily close cooperation between the teams performing the experiments and the computer simulations. 'The experimentalists knew all the physical variables at a given point. They knew exactly the temperature, the density, the velocities,' said Petros Tzeferacos of the University of Chicago, a study co-author. 'This allows us to benchmark the code against something that we can see.' Such benchmarking – called validation – shows that the simulations can reproduce the experimental data. The simulations consumed 20 million processing hours on supercomputers at Argonne National Laboratory, in the USA. 
more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Binary Supermassive Black Hole System Discovered

Binary Supermassive Black Hole System Discovered | Amazing Science | Scoop.it

Astronomers using XMM-Newton have discovered, for the first time, a pair of supermassive black holes in orbit around one another in an ordinary looking galaxy.


Most massive galaxies in the Universe are thought to harbor at least one supermassive black hole at their center. Two supermassive black holes are the smoking gun that the galaxy has merged with another. Thus, finding binary supermassive black holes can tell astronomers about how galaxies evolved into their present-day shapes and sizes.


To date, only a few candidates for close binary supermassive black holes have been found. All are in active galaxies where they are constantly ripping gas clouds apart, in the prelude to crushing them out of existence. In the process of destruction, the gas is heated so much that it shines at many wavelengths, including X-rays. This gives the galaxy an unusually bright centre, and leads to it being called active.


On 10 June 2010, Dr Fukun Liu from Peking University in China and his colleagues spotted a tidal disruption event in the galaxy SDSS J120136.02+300305.5 (J120136 for short). They were scanning the data for such events and scheduled follow-up observations just days later with XMM-Newton and NASA’s Swift satellite.


The galaxy was still spilling X-rays into space. It looked exactly like a tidal disruption event caused by a supermassive black hole, but as they tracked the slowly fading emission day after day, something strange happened. The X-rays fell below detectable levels between days 27 and 48 after the discovery. Then they reappeared and continued to follow a more expected fading rate, as if nothing had happened.


“This is exactly what you would expect from a pair of supermassive black holes orbiting one another,” said Dr Liu, who is the lead author of the study published in the Astrophysical Journal (arXiv.org version). Dr Liu found that two configurations could reproduce the observations of J120136.


In the first, the primary black hole contained 10 million solar masses and was orbited by a black hole of about a million solar masses in an elliptical orbit. In the second solution, the primary black hole was about a million solar masses and in a circular orbit. In both cases, the separation between the black holes was relatively small – about 2 thousandths of a light year. This is about the width of our Solar System.
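For a sense of that separation, here is our own unit conversion (not a figure from the paper):

```python
# Convert the quoted separation of ~0.002 light-years into astronomical units.
AU_PER_LIGHT_YEAR = 63_241       # 1 light-year expressed in astronomical units
separation_ly = 0.002

separation_au = separation_ly * AU_PER_LIGHT_YEAR
print(f"separation = {separation_au:.0f} AU")
# ~126 AU, comparable to the distance from the Sun to the heliopause (~120 AU),
# which is why the article likens it to the width of our Solar System.
```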

more...
SPRO's curator insight, May 25, 7:37 AM

Stunning!

Scooped by Dr. Stefan Gruenwald
Scoop.it!

Black holes, fate of quantum information, and optimal quantum cloning machines

Black holes, fate of quantum information, and optimal quantum cloning machines | Amazing Science | Scoop.it

The fate of classical information incident on a quantum black hole has been the subject of an ongoing controversy in theoretical physics, because a calculation within the framework of semi-classical curved-space quantum field theory appears to show that the incident information is irretrievably lost, in contradiction to time-honored principles such as time-reversibility and unitarity. Within a framework embedded in quantum communication theory, however, it can be shown that signaling from past to future infinity in the presence of a Schwarzschild black hole can occur with arbitrary accuracy, and thus that classical information is not lost in black hole dynamics. The calculation relies on a treatment that is manifestly unitary from the outset, where probability conservation is guaranteed because black holes stimulate the emission of radiation in response to infalling matter. This stimulated radiation is non-thermal, and contains all of the information about the infalling matter, while Hawking radiation contains none of it.


Lenny Susskind writes in his book "The Black Hole War" that he proposed (in front of Sid Coleman and Stephen Hawking) that the problem would be solved if "the region just outside the horizon is occupied by a lot of tiny invisible Xerox machines" [6, p. 227]. But he then immediately retreated from this idea, because he thought it would violate the no-cloning theorem (which we now know it does not). Susskind later revived the idea in his "black hole complementarity" proposal, claiming that somehow information would both fall into the black hole and be reflected at the horizon, but that the no-cloning theorem would not be violated because nobody would ever know (as you can't make an experiment both inside and outside of the black hole). This idea is based on a profound misunderstanding of quantum cloning, and in particular its relation to stimulated emission of radiation.
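For reference, the no-cloning theorem at issue can be stated in one line; the short linearity argument below is standard textbook material and is not specific to the black hole debate.

```latex
\textbf{No-cloning theorem.} There is no unitary $U$ and blank state $|0\rangle$ such that
\[
  U\,\bigl(|\psi\rangle \otimes |0\rangle\bigr) = |\psi\rangle \otimes |\psi\rangle
  \quad \text{for every state } |\psi\rangle .
\]
% Sketch: if U cloned two states, unitarity would require
% <phi|psi> = <phi|psi>^2, so the overlap must be 0 or 1;
% only orthogonal or identical states can be copied exactly.
Approximate (``optimal'') cloning, of the kind realized by stimulated emission,
is allowed precisely because it is not exact.
```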



more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

▶ Monster Black Holes and the Passage of Time ! 2014 New Documentary

Why do some stars end up as black holes? [Or,] What does the exclusion principle have to do with whether or not a star becomes a black hole?


How is time changed in a black hole?


Does the E=mc^2 equation apply to a black hole?


If nothing travels at the speed of light, except light, how can a black hole also pull light into itself?


What is the best evidence for the existence of black holes? Is it all really just a theory?


I've heard that a black hole 'belches' light and radiation whenever something falls into its event horizon. What does that mean and why does that happen?


Can you see a black hole? What does a black hole look like?


How big can a black hole get?


How small can a black hole be?


[In reference to the answer to question 1 above.] Why don't the internal electron forces of a star increase at the same rate as gravitational forces?


Will an observer falling into a black hole be able to witness all future events in the universe outside the black hole?


Could black holes be used as an energy source?


I read somewhere that in the VERY distant future black holes could leak and disperse. Can that happen? If it can, how?

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Astronomers claim to have the most compelling case for annihilating dark matter yet

Astronomers claim to have the most compelling case for annihilating dark matter yet | Amazing Science | Scoop.it

Dark matter is arguably one of the universe’s most perplexing mysteries. Astronomers have gathered overwhelming evidence that it makes up roughly 84% of the universe's matter. Its extra gravity provides the most straightforward explanation for the rotations of individual galaxies, the motions of distant galaxy clusters, and the bending of distant starlight. 

So what is this elusive matter? A popular theory is that it consists of a yet-undiscovered exotic massive particle that barely interacts with normal matter. These particles have so far eluded detection. But theoretically they act as their own antiparticles, and can annihilate to produce a cascade of familiar particles, including electrons and positrons. The collision should generate gamma-rays — the most energetic photons in nature.
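As a rough guide to the energies involved (the WIMP mass below is a purely hypothetical example, of the order often discussed for the galactic-centre signal):

```python
# Illustrative energetics only: two annihilating WIMPs release their combined
# rest energy, so the photons in the resulting cascade are capped at roughly
# the WIMP mass. The mass used here is an assumed, illustrative value.
wimp_mass_gev = 35.0
total_energy_gev = 2.0 * wimp_mass_gev
print(f"energy released per annihilation = {total_energy_gev:.0f} GeV; "
      f"individual gamma rays carry up to ~{wimp_mass_gev:.0f} GeV")
```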

NASA’s Fermi Gamma-Ray Space Telescope has been scouring the sky in search of this tell-tale annihilation signature since its launch in 2008. While the telescope has spotted a large number of gamma rays pouring outward from the center of our galaxy, astronomers have not been able to determine if this detection is due to dark matter annihilation or other natural particle accelerators. 

The most likely culprits for the latter alternative are undetected pulsars. These rotating neutron stars beam huge amounts of energy out of their poles, including matter-antimatter pairs that can annihilate in bursts of gamma rays.

A team of astronomers led by Tansu Daylan (Harvard University) has further scrutinized the excess Fermi signal, and has ruled out pulsars as the cause. This leads to the conclusion that the signal must be due to annihilating dark matter — a claim that would resolve one of the biggest mysteries in physics. 

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Dark matter looks more and more likely after new gamma-ray analysis

Dark matter looks more and more likely after new gamma-ray analysis | Amazing Science | Scoop.it
Scientists describe as 'extremely interesting' new analysis that makes case for gamma rays tracing back to Wimp particles


Not long after the Fermi Gamma-ray Space Telescope took to the sky in 2008, astrophysicists noticed that it was picking up a steady rain of gamma rays pouring outward from the center of the Milky Way galaxy.


This high-energy radiation was consistent with the detritus of annihilating dark matter, the unidentified particles that constitute 84% of the matter in the universe and that fizzle upon contact with each other, spewing other particles as they go. If the gamma rays did in fact come from dark matter, they would reveal its identity, resolving one of the biggest mysteries in physics. But some argued that the gamma rays could have originated from another source.


Now a new analysis of the signal claims to rule out all other plausible explanations and makes the case that the gamma rays trace back to a type of particle that has long been considered the leading dark matter candidate – a weakly interacting massive particle, or Wimp. Meanwhile, a more tentative X-ray signal reported in two other new studies suggests the existence of yet another kind of dark matter particle called a sterile neutrino.


In the new gamma-ray analysis, which appeared February 27 on the scientific preprint site arXiv.org, Dan Hooper and his collaborators used more than five years' worth of the cleanest Fermi data to generate a high-resolution map of the gamma-ray excess extending from the center of the galaxy outward at least 10 angular degrees, or 5,000 light-years, in all directions.
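The quoted conversion from 10 angular degrees to about 5,000 light-years is easy to check; the arithmetic below is our own, using the standard approximate distance to the galactic centre.

```python
import math

# Projected size of a 10-degree angular radius at the distance of the galactic centre
distance_ly = 26_000             # approximate distance to the galactic centre
angle_deg = 10.0

extent_ly = distance_ly * math.tan(math.radians(angle_deg))
print(f"projected radius = {extent_ly:.0f} light-years")   # ~4,600 ly, roughly the quoted 5,000
```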


"The results are extremely interesting," said Kevork Abazajian, an associate professor of physics and astronomy at the University of California, Irvine. "The most remarkable part of the analysis is that the signal follows the shape of the dark matter profile out to 10 degrees," he said, explaining that it would be "very difficult to impossible" for other sources to mimic this predicted dark matter distribution over such a broad range.


The findings do not constitute a discovery of dark matter, the scientists said, but they prepare the way for an upcoming test described by many researchers as a "smoking gun": If the gamma-ray excess comes from annihilating Wimps, and not conventional astrophysical objects, then the signal will also be seen emanating from dwarf galaxies that orbit the Milky Way – diffuse objects that are rich in dark matter but not in other high-energy photon sources such as pulsars, rotating neutron stars that have been floated as alternative explanations for the excess.

more...
Eli Levine's curator insight, March 16, 10:50 AM

The more we know about this place we call "the universe," the more likely we'll be able to understand ourselves and to put our minds at ease about a great many of our pressing questions.

 

The more we know about this place, the more likely it will be that we're able to alleviate suffering, misery and, hopefully, bring a greater quality of life and sense of peace to all of our minds.

 

Assuming that each of our particular minds is willing and able to accept the truths of this world in the first place.

 

Think about it.