Amazing Science
Track Elon Musk's Tesla Roadster in Space!


On February 6, 2018, SpaceX launched a Tesla Roadster and a spacesuit-clad mannequin into space aboard its Falcon Heavy rocket. Presuming the harshness of space hasn't destroyed the duo, the question is: where is the payload five years after launch? The answer may surprise you.

 

Back in 2018, SpaceX had just begun to demonstrate its reliability as a launch provider, shoring up newly awarded contracts. However, its Falcon Heavy rocket was still in development, and SpaceX needed a dummy payload to test it. This became the genesis of Starman, a mannequin in a spacesuit who occupies the driver's seat of the red Tesla Roadster.

 

But where is the vehicle now? The current location is 203,029,386 miles (326,744,225 km, 2.184 AU, 18.17 light minutes) from Earth, moving toward Earth at a speed of 7,105 mi/h (11,435 km/h, 3.18 km/s).

The car is 280,838,746 miles (451,966,291 km, 3.021 AU, 25.13 light minutes) from Mars, moving away from the planet at a speed of 16,314 mi/h (26,254 km/h, 7.29 km/s).

The car is 137,142,121 miles (220,708,917 km, 1.475 AU, 12.27 light minutes) from the Sun, moving away from the star at a speed of 12,014 mi/h (19,335 km/h, 5.37 km/s).

The car has exceeded its 36,000-mile warranty 70,125.8 times while driving around the Sun (2,524,528,836 miles; 4,062,836,592 km; 27.16 AU), moving at a speed of 51,712 mi/h (83,222 km/h, 23.12 km/s). The orbital period is about 557 days.

It has achieved a fuel economy of 20,035.9 miles per gallon (8,518.2 km/liter, 0.01174 liters/100 km), assuming 126,000 gallons of fuel.
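
These warranty and fuel-economy figures are straightforward arithmetic on the tracker's odometer reading; below is a minimal sketch of the calculation, using only the numbers quoted above.

```python
# Back-of-the-envelope check of the Roadster tracker's figures
odometer_miles = 2_524_528_836   # distance driven around the Sun, from the tracker
warranty_miles = 36_000          # original Tesla Roadster warranty
fuel_gallons   = 126_000         # "fuel" assumption quoted above

warranty_multiples = odometer_miles / warranty_miles
mpg = odometer_miles / fuel_gallons
km_per_liter = mpg * 1.60934 / 3.78541
liters_per_100km = 100.0 / km_per_liter

print(f"Warranty exceeded {warranty_multiples:,.1f} times")     # ~70,125.8
print(f"Fuel economy: {mpg:,.1f} mpg ({km_per_liter:,.1f} km/L, "
      f"{liters_per_100km:.5f} L/100 km)")                      # ~20,036 mpg
```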

If the battery were still working, Starman would have listened to "Space Oddity" 496,721 times in one ear since launch, and to "Is There Life on Mars?" 669,310 times in the other ear.

Starman has completed about 3.2823 orbits around the Sun since launch. A telescope about 47,954 ft (14,616 m) in diameter would be required to resolve the upper stage from Earth; a smaller one, about 92.2 ft (28.1 m) in diameter, could detect it as an unresolved dot under ideal conditions.
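
Those telescope sizes follow from the diffraction limit. Here is a rough order-of-magnitude check; the visible wavelength and the ~15 m object size are illustrative assumptions, not the tracker's exact inputs.

```python
# Diffraction-limited aperture needed to resolve the Roadster/upper stage from Earth
wavelength = 550e-9                  # m, visible light (assumed)
object_size = 15.0                   # m, rough size of upper stage plus car (assumed)
distance = 2.184 * 1.496e11          # m, the 2.184 AU quoted by the tracker

angular_size = object_size / distance            # radians subtended by the object
aperture = 1.22 * wavelength / angular_size      # Rayleigh criterion: theta = 1.22*lambda/D

print(f"Required aperture: ~{aperture/1000:.1f} km")   # ~14.6 km, same order as quoted
```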

The vehicle has traveled far enough to drive all of the world’s roads 63.2 times. It has been 5 years, 1 day and 21 hours since launch.


Gama - Enabling Deep Space Exploration: The first European Solar Sail Gets Launched


The idea started as an essay 20 years ago about the principle of photonic propulsion: that it is possible to move space vehicles using sunlight. When a photon reflects off a surface, it transfers a small amount of momentum, exerting pressure on that surface. Deploying a huge mirror in space is all you would need to use this photonic pressure to move a vehicle.
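
To get a feel for how gentle this photon pressure is, the sketch below estimates the force sunlight exerts on a perfectly reflecting sail of the 73.3 m2 size mentioned later in this post, assuming the roughly 1,361 W/m2 solar irradiance near Earth; both values are used purely for illustration.

```python
# Radiation pressure on a perfectly reflecting sail: P = 2*I/c
C = 299_792_458.0        # speed of light, m/s
irradiance = 1361.0      # approximate solar irradiance near Earth, W/m^2
sail_area = 73.3         # m^2, the sail size quoted below

pressure = 2 * irradiance / C          # N/m^2, factor 2 for full reflection
force = pressure * sail_area           # newtons

print(f"Photon pressure: {pressure:.2e} N/m^2")   # ~9.1e-6 N/m^2
print(f"Force on sail:   {force:.2e} N")          # ~6.7e-4 N, a fraction of a millinewton
```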

 

This is the principle of the solar sail and the subject of the Gama project which started in August 2020. A solar sail could theoretically accelerate to speeds never before achieved by objects created by humanity. This efficient and economical propulsion method could offer new opportunities in terms of space missions.

 

Today is an exceptional day for this project and for the whole team. The satellite, which is the size of five milk bottles (6U) and weighs 12 kg, is being placed into a 550 km low Earth orbit (LEO). It combines a "Bus" part containing all the system elements (power, communication), developed by NanoAvionics, with a "Sail" part on which the team has concentrated its efforts. The sail is made of an extremely thin (2.5 micron) reflective material (aluminized polyimide, CP1). It is folded origami-style in a winder and is equipped with four tungsten weights fixed at its ends.

 

When the satellite separates from the Falcon 9 rocket, a first test phase will begin, followed by a deployment phase. A slow spin-up lasting several days will unfold the 73.3 m2 sail; it is the centrifugal force that unrolls and rigidifies it.

 

This first mission will simply allow the scientific team to test the deployment of the sail. At this altitude there is still too much residual air slowing the sail down, making it impossible to navigate. The team will have to wait for the next mission, "Gama Beta", in 2024, and the deployment of a sail in a higher orbit, to test navigation and make the satellite move using photonic pressure alone. From that point on, the scientists will judge whether the technology is mature enough to be commercialized for deep-space exploration and exploitation missions.


Microwave deep drilling is set to unlock near-limitless ultra-deep geothermal energy


MIT spin-off Quaise says it's going to use hijacked fusion technology to drill the deepest holes in history, unlocking clean, virtually limitless, supercritical geothermal energy that could re-power fossil-fueled power plants all over the world. Using millimeter wave drilling systems, Quaise Energy states its technology can drill up to 20 kilometers (12.4 miles) deep and harness the "virtually unlimited" amount of heat found below. The firm has now raised $40 million in Series A funding for the venture. "A rapid transition to clean energy is one of the biggest challenges faced by humanity," said Arunas Chesonis, Managing Partner of Safar Partners, which is leading the financing round, in a statement.

 

Everyone knows the Earth's core is hot, but the scale of it may still surprise. Temperatures at the iron center of the core are estimated to be around 5,200 °C (9,392 °F), generated by the decay of radioactive elements combined with heat left over from the very formation of the planet – an event of cataclysmic violence in which a swirling cloud of gas and dust was crushed into a ball by its own gravity.

 

Where there's access to heat, there's harvestable geothermal energy. And there's so much heat below the Earth's surface, according to Paul Woskov, a senior fusion research engineer at MIT, that tapping just 0.1 percent of it could supply the entire world's energy needs for more than 20 million years.

 
The problem is access. Where subterranean heat sources naturally occur close to the surface, easily accessible and close enough to a relevant power grid for economically viable transmission, geothermal becomes a rare example of totally reliable, round-the-clock green power generation. The Sun stops shining, the wind stops blowing, but the rock's always hot. Of course, these conditions are fairly rare, and as a result, geothermal currently supplies only around 0.3 percent of global energy consumption.

 

The deepest human drill holes are not deep enough

If we could drill deep enough, we could put geothermal power stations just about anywhere we wanted them. But that's harder than it sounds. The Earth's crust varies in thickness from about 5 to 75 km (3 to 47 miles), with the thinnest parts tending to be way out in the deep ocean.

 

The deepest hole humanity has ever drilled is the Kola Superdeep Borehole. This Russian project near the Norwegian border broke ground in 1970, aiming to puncture the crust right down to the mantle, and one of its boreholes reached a vertical depth of 12,289 m (40,318 ft) in 1989 before the team decided it was unfeasible to go any deeper and ran out of money. At that depth, the Kola team expected the temperature to be somewhere around 100 °C (212 °F), but in reality it was closer to 180 °C (356 °F). The rock was less dense and more porous than expected, and these factors, combined with the elevated heat, created nightmare drilling conditions. The Kola site has fallen into complete disrepair, and this "entrance to hell," a pinnacle (or perhaps nadir) of human achievement, is now an anonymous, welded-shut hole.

 

Conventional drills glide easily through the rock and minerals found at shallow depths in the crust, but as they go deeper, hard rock, extreme pressure and rising temperatures render drill bits useless. A material able to withstand such conditions would be prohibitively expensive, so drilling has remained strictly near the surface.

Enter Quaise Energy's "gyrotron-powered drilling platform". The company will use conventional drilling to reach basement rock, then switch to its new platform, which directs high-energy waves downward through a long waveguide. There is no conventional drill bit to melt, and the waves should be able to handle the dense, hot rock found at depth. Quaise's millimeter wave drilling system uses a technology that has been theorized for almost a decade but has yet to be produced at scale. Considered a form of "directed energy drilling" (which sounds like something ripped straight out of Star Trek), it uses high-frequency waves to heat the rock in their path until it melts or vaporizes. Quaise would then use this heat to "repower traditional power plants", removing the need for fossil fuels.

 

"Our technology allows us to access energy anywhere in the world, at a scale far greater than wind and solar, enabling future generations to thrive in a world powered with abundant clean energy" said Carlos Araque, CEO and co-founder of Quaise Energy, in a statement. To date, the company has understandably remained relatively quiet on the exact capabilities of its technology. As such, it is difficult to tell whether the technology will be viable at this stage, but millimeter wave drilling systems are not without their challenges. For one, they require large amounts of energy to produce sufficient directed energy to melt or vaporize rocks, which Quaise claims their gyrotron source is an efficient device for the job. There are also significant transmission losses of energy when high-frequency waves are sent over distance, with the high temperatures 10-20 kilometers (6.2-12.4 miles) into the crust also affecting the transmission efficiency.

 

It remains to be seen whether Quaise can deliver on their promises – they aim to build functional drilling machines by 2024  – but if they can, it would be a geothermal breakthrough that would send ripples through the world of clean energy. 


MinD-Vis: Decoding visual stimuli from brain recordings with high accuracy


For the first time, the researchers show that non-invasive brain recordings can be used to decode images with performance comparable to invasive measures.

 

Decoding visual stimuli from brain recordings aims to deepen our understanding of the human visual system and build a solid foundation for bridging human vision and computer vision through the Brain-Computer Interface. However, due to the scarcity of data annotations and the complexity of underlying brain information, it is challenging to decode images with faithful details and meaningful semantics.

In this work, AI scientists present MinD-Vis: Sparse Masked Brain Modeling with Double-Conditioned Diffusion Model for Vision Decoding. By boosting the information capacity of representations learned from a large-scale resting-state fMRI dataset, they show that the MinD-Vis framework can reconstruct highly plausible images with semantically matching details from brain recordings using very few training pairs. The benchmarked model and associated method outperform the state of the art in both semantic mapping (100-way semantic classification) and generation quality (FID), by 66% and 41% respectively. Exhaustive ablation studies were conducted to analyze the framework.

The result is a human visual decoding system that relies on only limited annotations. It achieves state-of-the-art 100-way top-1 classification accuracy on the GOD dataset (23.9%, outperforming the previous best by 66%) and state-of-the-art generation quality on the same dataset (FID of 1.67, outperforming the previous best by 41%).


Putting the brakes on lithium-ion batteries to prevent fires


Lithium-ion (Li-ion) batteries are used to power everything from smart watches to electric vehicles, thanks to the large amounts of energy they can store in small spaces. When overheated, however, they’re prone to catching fire or even exploding. But recent research published in ACS’ Nano Letters offers a possible solution with a new technology that can swiftly put the brakes on a Li-ion battery, shutting it down when it gets too hot.

 

The chemistry found in many batteries is essentially the same: Electrons are shuttled through an electronic device in a circuit from one electrode in the battery to another. But in a Li-ion cell, the electrolyte liquid that separates these electrodes can evaporate when it overheats, causing a short circuit. In certain cases, short circuiting can lead to thermal runaway, a process in which a cell heats itself uncontrollably. When multiple Li-ion cells are chained together — such as in electric vehicles — thermal runaway can spread from one unit to the next, resulting in a very large, hard-to-fight fire. To prevent this, some batteries now have fail-safe features, such as external vents, temperature sensors or flame-retardant electrolytes. But these measures often either kick in too late or harm performance. So, Yapei Wang, Kai Liu and colleagues wanted to create a Li-ion battery that could shut itself down quickly, but also work just as well as existing technologies.

 

The researchers used a thermally responsive shape-memory polymer covered with a conductive copper spray to create a material that would transmit electrons most of the time but switch to being an insulator when heated excessively. At around 197 °F (about 92 °C), a microscopic 3D pattern programmed into the polymer appeared, breaking apart the copper layer and stopping the flow of electrons. This permanently shut down the cell but prevented a potential fire. At this temperature, however, traditional cells kept running, putting them at risk of thermal runaway if they continued to heat up. Under regular operating temperatures, the battery with the new polymer maintained high conductivity, low resistivity and a cycling lifetime similar to that of a traditional cell. The researchers say this technology could make Li-ion batteries safer without sacrificing their performance.


A laser that could ‘reshape the landscape of integrated photonics’


How do you integrate the advantages of a benchtop laser that fills a room onto a semiconductor chip the size of a fingernail? A research team co-led by Qiang Lin, a professor of electrical and computer engineering at the University of Rochester, has set new milestones in addressing this challenge, with the first multi-color integrated Pockels laser that:

  • Emits high-coherence light at telecommunication wavelengths
  • Allows laser-frequency tuning at record speeds
  • Is the first narrow linewidth laser with fast configurability at the visible band

 

The project, described in Nature Communications, was co-led by John Bowers, distinguished professor at University of California/Santa Barbara, and Kerry Vahala, professor at the California Institute of Technology. Lin Zhu, professor at Clemson University, also collaborated on the project. The technology “has the potential to reshape the landscape of integrated photonics,” write co-lead authors Mingxiao Li, a former PhD student in Lin’s Laboratory for Nanophotonics at Rochester’s Hajim School of Engineering & Applied Sciences, and Lin Chang, a former postdoctoral student at University of California/Santa Barbara. It will pave the way for new applications of integrated semiconductor lasers in LiDAR (Light Detection and Ranging) remote sensing that is used, for example, in self-driving cars. The technology could also lead to advances in microwave photonics, atomic physics, and AR/VR.

 

A ‘fully on-chip laser solution’

Integrated semiconductor lasers have been at the core of integrated photonics, enabling many advances over the last few decades in information technologies and basic science. “However, despite these impressive achievements, key functions are missing in current integrated lasers,” Li says. “Two major challenges, the lack of fast reconfigurability and the narrow spectral window, have become major bottlenecks that stall the progression of many evolving applications,” Chang adds.

 

The researchers say they’ve overcome these challenges by creating a new type of integrated semiconductor laser, based on the Pockels effect. The laser is integrated on a lithium-niobate-on-insulator platform.

 

The new technology includes these beneficial features:

  • Fast frequency chirping, which will be invaluable in LiDAR sensor systems that measure distance by timing the interval between emission of a short pulse and reception of the reflected light (see the sketch after this list).
  • Frequency conversion capabilities that overcome spectral bandwidth limitations of traditional integrated semiconductor lasers. This will “significantly relieve” the difficulties in developing new wavelength lasers.
  • Narrow wavelength and fast reconfigurability, providing a “fully on-chip laser solution” to probe and manipulate atoms and ions in atomic physics, and benefit AR/VR and other applications at short wavelengths.
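
To make the time-of-flight principle concrete, here is a minimal sketch of how a pulsed LiDAR converts a measured round-trip delay into a range; the 667-nanosecond delay in the example is purely illustrative.

```python
# Pulsed LiDAR ranging: distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # m/s

def range_from_round_trip(delay_s: float) -> float:
    """Return target distance in meters for a measured round-trip delay."""
    return C * delay_s / 2.0

# Example: a reflection arriving 667 nanoseconds after the pulse was emitted
print(f"{range_from_round_trip(667e-9):.1f} m")  # ~100 m
```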

 

The researchers see applications for the new type of integrated semiconductor laser in LiDAR, atomic physics, and AR/VR.


Carnegie Mellon University: Smart headlights can see through rain and snow

Drivers can struggle to see when driving at night in a rainstorm or snowstorm, but a smart headlight system invented by researchers at Carnegie Mellon University's Robotics Institute can improve visibility by constantly redirecting light to shine between particles of precipitation. The system, demonstrated in laboratory tests, prevents the distracting and sometimes dangerous glare that occurs when headlight beams are reflected by precipitation back toward the driver.

 

The first headlights to adorn automobiles weren’t all that much better than squinting real hard and hoping any cows in the road had the good sense to move out of your way. The dim light cast by early kerosene oil and acetylene gas lamps made most travel after dark a fool’s errand. 

 

Today, of course, the latest generation of headlights works much like a modern television, with tightly packed arrays of pixelated lights blinking up to 5,000 times a second, allowing drivers to essentially use high and low beams at the same time. Until very recently, however, cutting-edge features like that weren't allowed on vehicles sold in the US due to an NHTSA regulation dating to the 1960s. But thanks to a multi-year lobbying effort on the part of Toyota, those regulations changed last February, and America's roadways are about to become a bit brighter and a whole lot safer.

 

Following the short-lived idea of using open flames to light the way, the first electric headlights appeared on the 1912 Cadillac Model 30 and, within the next decade, were quickly becoming mandatory equipment across the nation. The first split-intensity headlights offering separate low and high beams were produced in 1915, but they wouldn't be included in a vehicle's OEM design until 1924, and the floor-mounted switch that controlled them wouldn't be invented until three years after that: a full decade of having to get out of the car just to turn your lights on and switch between brightnesses!

 

The advent of sealed-beam headlights with filaments for both low and high beams in 1954, and their widespread adoption by 1957, proved a massive technological leap. With low beams for dusk and evening driving, and high beams for late-night travel on otherwise unlit roads, these new headlights drastically extended the hours of the day a car could safely be on the road.

 

The first halogen light, which would itself quickly become a global standard, debuted in 1962. But halogens at that time were about as popular in the US as the metric system; we still preferred tungsten incandescents. That changed with the passage of the National Traffic and Motor Vehicle Safety Act of 1966, the formation of the National Highway Traffic Safety Administration (NHTSA), which took the existing hodge-podge of state-level vehicular regulations and federalized them, and the formal adoption in 1968 of Federal Motor Vehicle Safety Standard (FMVSS) 108, which dictated that all headlights be constructed as sealed beams.


Credit: Srinivasa Narasimhan/Carnegie Mellon University

Read more at Futurity: http://www.futurity.org


Topological superconductivity: New hybrid structures could pave the way to more stable quantum computers

A new way to explore topological superconductivity.

 

A new way to combine two materials with special electrical properties -- a monolayer superconductor and a topological insulator -- provides the best platform to date to explore an unusual form of superconductivity called topological superconductivity. The combination could provide the basis for topological quantum computers that are more stable than their traditional counterparts.

 

Superconductors -- used in powerful magnets, digital circuits, and imaging devices -- allow electric current to pass without resistance, while topological insulators are thin films only a few atoms thick that restrict the movement of electrons to their edges, which can result in unique properties. A team led by researchers at Penn State describes how they paired the two materials in a paper published Oct. 27, 2022, in the journal Nature Materials.

 

"The future of quantum computing depends on a kind of material that we call a topological superconductor, which can be formed by combining a topological insulator with a superconductor, but the actual process of combining these two materials is challenging," said Cui-Zu Chang, Henry W. Knerr Early Career Professor and Associate Professor of Physics at Penn State and leader of the research team. "In this study, we used a technique called molecular beam epitaxy to synthesize both topological insulator and superconductor films and create a two-dimensional heterostructure that is an excellent platform to explore the phenomenon of topological superconductivity."

 

In previous experiments to combine the two materials, the superconductivity in thin films usually disappears once a topological insulator layer is grown on top. Physicists have been able to add a topological insulator film onto a three-dimensional "bulk" superconductor and retain the properties of both materials. However, applications for topological superconductors, such as chips with low power consumption inside quantum computers or smartphones, would need to be two-dimensional.

 

In this paper, the research team stacked a topological insulator film made of bismuth selenide (Bi2Se3), at different thicknesses, on a superconductor film made of monolayer niobium diselenide (NbSe2), resulting in a two-dimensional end product. By synthesizing the heterostructures at very low temperature, the team was able to retain both the topological and superconducting properties. "In superconductors, electrons form 'Cooper pairs' and can flow with zero resistance, but a strong magnetic field can break those pairs," said Hemian Yi, a postdoctoral scholar in the Chang Research Group at Penn State and the first author of the paper. "The monolayer superconductor film we used is known for its 'Ising-type superconductivity,' which means that the Cooper pairs are very robust against the in-plane magnetic fields. We would also expect the topological superconducting phase formed in our heterostructures to be robust in this way."


Battery technology breakthrough that allows charging of EV batteries in just 10 minutes


A breakthrough in electric vehicle battery design has enabled a 10-minute charge time for a typical EV battery. The record-breaking combination of a shorter charge time and more energy acquired for a longer travel range was announced on Oct. 12, 2022, in Nature.

 

"The need for smaller, faster-charging batteries is greater than ever," said Chao-Yang Wang, the William E. Diefenderfer Professor of Mechanical Engineering at Penn State and lead author on the study. "There are simply not enough batteries and critical raw materials, especially those produced domestically, to meet anticipated demand."

 

In August 2022, California's Air Resources Board passed an extensive plan to restrict and ultimately ban the sale of gasoline-powered cars within the state. By 2035, the largest auto market in the United States will effectively retire the internal combustion engine. If new car sales are going to shift to battery-powered electric vehicles (EVs), Wang explained, they'll need to overcome two major drawbacks: they are too slow to recharge and too large to be efficient and affordable. Instead of taking a few minutes at the gas pump, depending on the battery, some EVs can take all day to recharge.

 

"Our fast-charging technology works for most energy-dense batteries and will open a new possibility to downsize electric vehicle batteries from 150 to 50 kWh without causing drivers to feel range anxiety," said Wang, whose lab partnered with State College-based startup EC Power to develop the technology. "The smaller, faster-charging batteries will dramatically cut down battery cost and usage of critical raw materials such as cobalt, graphite and lithium, enabling mass adoption of affordable electric cars."

 

The presented technology relies on internal thermal modulation, an active method of temperature control to demand the best performance possible from the battery, Wang explained. Batteries operate most efficiently when they are hot, but not too hot. Keeping batteries consistently at just the right temperature has been a major challenge for battery engineers. Historically, they have relied on external, bulky heating and cooling systems to regulate battery temperature, which respond slowly and waste a lot of energy, Wang said.

 

Wang and his team decided instead to regulate the temperature from inside the battery. The researchers developed a new battery structure that adds an ultra-thin nickel foil as a fourth component alongside the anode, electrolyte and cathode. Acting as a stimulus, the nickel foil self-regulates the battery's temperature and reactivity, which allows for 10-minute fast charging on just about any EV battery, Wang explained. "True fast-charging batteries would have immediate impact," the researchers write. "Since there are not enough raw minerals for every internal combustion engine car to be replaced by a 150 kWh-equipped EV, fast charging is imperative for EVs to go mainstream."
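
For a sense of what a 10-minute full charge implies on the charger side, here is a quick estimate of the average charging power for the battery sizes quoted above, ignoring charging losses.

```python
# Average power needed to deliver a full charge in 10 minutes (ideal, lossless)
def avg_charge_power_kw(capacity_kwh: float, minutes: float) -> float:
    return capacity_kwh / (minutes / 60.0)

for capacity in (150, 50):   # kWh values quoted by Wang
    print(f"{capacity} kWh in 10 min -> {avg_charge_power_kw(capacity, 10):.0f} kW average")
# 150 kWh -> 900 kW; 50 kWh -> 300 kW
```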


Disposable electronics circuit board made of a sheet of paper with fully integrated electrical components


Discarded electronic devices, such as cell phones, are a fast-growing source of waste. One way to mitigate the problem could be to use components that are made with renewable resources and that are easy to dispose of responsibly. Now, researchers reporting in ACS Applied Materials & Interfaces have created a prototype circuit board that is made of a sheet of paper with fully integrated electrical components and that can be burned or left to degrade.

 

Most small electronic devices contain circuit boards that are made from glass fibers, resins and metal wiring. These boards are not easy to recycle and are relatively bulky, making them undesirable for use in point-of-care medical devices, environmental monitors or personal wearable devices. One alternative is to use paper-based circuit boards, which should be easier to dispose of, less expensive and more flexible. However, current options require specialized paper, or they simply have traditional metal circuitry components mounted onto a sheet of paper. Instead, Choi and colleagues wanted to develop circuitry that would be simple to manufacture and that had all the electronic components fully integrated into the sheet.

 

The team designed a paper-based amplifier-type circuit that incorporated resistors, capacitors and a transistor. They first used wax to print channels onto a sheet of paper in a simple pattern. After melting the wax so that it soaked into the paper, the team printed semiconductive and conductive inks, which soaked into the areas not blocked by wax. Then the researchers screen-printed additional conductive metal components and cast a gel-based electrolyte onto the sheet.

 

Tests confirmed that the resistor, capacitor and transistor designs performed properly. The final circuit was very flexible and thin, just like paper, even after adding the components. To demonstrate the degradability of the circuit, the team showed that the entire unit quickly burned to ash after being lit on fire. The researchers say this represents a step toward producing completely disposable electronic devices.


Battery-free, wireless underwater camera developed using energy from sound waves

MIT researchers built a battery-free, wireless underwater camera, powered by sound waves, that can take high-quality, color images, even in dark environments. It transmits image data through the open water to a receiver that reconstructs the color image.

 

Scientists estimate that more than 95 percent of Earth's oceans have never been observed, which means we have seen less of our planet's ocean than we have the far side of the moon or the surface of Mars. The high cost of powering an underwater camera for a long time, by tethering it to a research vessel or sending a ship to recharge its batteries, is a steep challenge preventing widespread undersea exploration.

 

MIT researchers have taken a major step to overcome this problem by developing a battery-free, wireless underwater camera that is about 100,000 times more energy-efficient than other undersea cameras. The device takes color photos, even in dark underwater environments, and transmits image data wirelessly through the water. The autonomous camera is powered by sound. It converts mechanical energy from sound waves traveling through water into electrical energy that powers its imaging and communications equipment. After capturing and encoding image data, the camera also uses sound waves to transmit data to a receiver that reconstructs the image.

 

Because it doesn't need a power source, the camera could run for weeks on end before retrieval, enabling scientists to search remote parts of the ocean for new species. It could also be used to capture images of ocean pollution or monitor the health and growth of fish raised in aquaculture farms. "One of the most exciting applications of this camera for me personally is in the context of climate monitoring. We are building climate models, but we are missing data from over 95 percent of the ocean. This technology could help us build more accurate climate models and better understand how climate change impacts the underwater world," says Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab, and senior author of the paper.

 

Going battery-free

To build a camera that could operate autonomously for long periods, the researchers needed a device that could harvest energy underwater on its own while consuming very little power.

The camera acquires energy using transducers made from piezoelectric materials that are placed around its exterior. Piezoelectric materials produce an electric signal when a mechanical force is applied to them. When a sound wave traveling through the water hits the transducers, they vibrate and convert that mechanical energy into electrical energy. Those sound waves could come from any source, like a passing ship or marine life. The camera stores harvested energy until it has built up enough to power the electronics that take photos and communicate data.

 

To keep power consumption as low as possible, the researchers used off-the-shelf, ultra-low-power imaging sensors. But these sensors only capture grayscale images. And since most underwater environments lack a light source, the camera needed a low-power flash, too. "We were trying to minimize the hardware as much as possible, and that creates new constraints on how to build the system, send information, and perform image reconstruction. It took a fair amount of creativity to figure out how to do this," Adib says. They solved both problems simultaneously using red, green, and blue LEDs. When the camera captures an image, it shines a red LED and then uses its image sensor to take the photo. It repeats the same process with the green and blue LEDs.
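
Because the red, green and blue LEDs fire sequentially, color can be recovered from a grayscale sensor in post-processing. The sketch below shows the general idea of combining the three single-color frames into one RGB image; the array shapes and normalization are assumptions, not MIT's actual pipeline.

```python
import numpy as np

def combine_rgb(frame_r, frame_g, frame_b):
    """Stack three grayscale frames, each taken under one LED color, into an RGB image.

    Each frame is a 2D array of raw sensor intensities captured while only the
    red, green, or blue LED was lit. This is the general idea, not MIT's code.
    """
    rgb = np.stack([frame_r, frame_g, frame_b], axis=-1).astype(np.float32)
    rgb /= max(rgb.max(), 1e-6)          # normalize to [0, 1] for display
    return rgb

# Example with dummy 8x8 frames
h = w = 8
frames = [np.random.rand(h, w) for _ in range(3)]
image = combine_rgb(*frames)
print(image.shape)   # (8, 8, 3)
```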


Astonishing new plane could make flights from London to New York in 80 minutes


An astonishing new plane could cut transatlantic flight times by a factor of more than five, claims Spanish designer Oscar Viñals. The latest in a series of futuristic designs going viral, Oscar's images show a super-streamlined jet, dubbed the Hyper Sting, soaring through the sky and transporting travelers from place to place in record time.

 

The idea uses a combination of theoretical cold fusion nuclear systems and Mach 3.5 flight technology. Mach is the ratio of the speed of a body to the speed of sound: Mach 1 is the speed of sound, Mach 2 is twice the speed of sound, and so on. The plane would shoot through the air at almost 2,500 mph, nearly five times the current average speed of commercial passenger planes, which take up to eight hours to complete the journey from London to New York. If plans for the jet came to fruition, it would take just 80 minutes to complete the popular transatlantic route.
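
A rough sanity check of those numbers is sketched below, assuming a representative speed of sound at cruise altitude and an approximate London to New York great-circle distance; both figures are illustrative assumptions.

```python
# Rough Mach-to-speed and flight-time estimate (ignores acceleration, routing, winds)
speed_of_sound_mph = 660.0      # approximate speed of sound at high cruise altitude
mach = 3.5
distance_miles = 3460.0         # approximate London-New York great-circle distance

cruise_mph = mach * speed_of_sound_mph
hours = distance_miles / cruise_mph
print(f"Cruise speed: {cruise_mph:.0f} mph")        # ~2,310 mph
print(f"Flight time:  {hours*60:.0f} minutes")      # ~90 minutes at pure cruise
```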

 

Concorde, the last supersonic commercial carrier, was powered by four Rolls-Royce/Snecma Olympus 593 turbojets and completed the popular route in a record two hours, 52 minutes and 59 seconds. It was seen as the ultimate development in aviation and enabled business trips to be condensed into a matter of days – even, controversially, hours – before the rise of web chat technology.

Concorde's rule of the skies came to an end when it was retired in 2003 after nearly three decades of operation, a decision attributed to falling demand driven by rising costs and to the tragic fatal crash of July 2000.


MIT’s MOXIE reliably produces oxygen on Mars

MIT’s MOXIE experiment has now produced oxygen on Mars. It is the first demonstration of in-situ resource utilization on the Red Planet, and a key step in the goal of sending humans on a Martian mission.

 

On the red and dusty surface of Mars, nearly 100 million miles from Earth, an instrument the size of a lunchbox is proving it can reliably do the work of a small tree. The MIT-led Mars Oxygen In-Situ Resource Utilization Experiment, or MOXIE, has been successfully making oxygen from the Red Planet's carbon-dioxide-rich atmosphere since February 2021, when it touched down on the Martian surface as part of NASA's Perseverance rover mission.

 

In a recently published study in the journal Science Advances, researchers report that, by the end of 2021, MOXIE was able to produce oxygen on seven experimental runs, in a variety of atmospheric conditions, including during the day and night, and through different Martian seasons. In each run, the instrument reached its target of producing six grams of oxygen per hour -- about the rate of a modest tree on Earth. Researchers envision that a scaled-up version of MOXIE could be sent to Mars ahead of a human mission, to continuously produce oxygen at the rate of several hundred trees. At that capacity, the system should generate enough oxygen to both sustain humans once they arrive, and fuel a rocket for returning astronauts back to Earth.

 

So far, MOXIE's steady output is a promising first step toward that goal. "We have learned a tremendous amount that will inform future systems at a larger scale," says Michael Hecht, principal investigator of the MOXIE mission at MIT's Haystack Observatory. MOXIE's oxygen production on Mars also represents the first demonstration of "in-situ resource utilization," which is the idea of harvesting and using a planet's materials (in this case, carbon dioxide on Mars) to make resources (such as oxygen) that would otherwise have to be transported from Earth.

 

"This is the first demonstration of actually using resources on the surface of another planetary body, and transforming them chemically into something that would be useful for a human mission," says MOXIE deputy principal investigator Jeffrey Hoffman, a professor of the practice in MIT's Department of Aeronautics and Astronautics. "It's historic in that sense."

 

Hoffman and Hecht's MIT co-authors include MOXIE team members Jason SooHoo, Andrew Liu, Eric Hinterman, Maya Nasr, Shravan Hariharan, and Kyle Horn, along with collaborators from multiple institutions including NASA's Jet Propulsion Laboratory, which managed MOXIE's development, flight software, packaging, and testing prior to launch.

 

Seasonal air

The current version of MOXIE is small by design, in order to fit aboard the Perseverance rover, and is built to run for short periods, starting up and shutting down with each run, depending on the rover's exploration schedule and mission responsibilities. In contrast, a full-scale oxygen factory would include larger units that would ideally run continuously.

 

Despite the necessary compromises in MOXIE's current design, the instrument has shown it can reliably and efficiently convert Mars' atmosphere into pure oxygen. It does so by first drawing the Martian air in through a filter that cleans it of contaminants. The air is then pressurized, and sent through the Solid OXide Electrolyzer (SOXE), an instrument developed and built by OxEon Energy, that electrochemically splits the carbon dioxide-rich air into oxygen ions and carbon monoxide. The oxygen ions are then isolated and recombined to form breathable, molecular oxygen, or O2, which MOXIE then measures for quantity and purity before releasing it harmlessly back into the air, along with carbon monoxide and other atmospheric gases.
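
Since the SOXE reaction is 2 CO2 -> 2 CO + O2, the CO2 throughput behind MOXIE's six-grams-per-hour oxygen target follows from the molar masses; a small sketch of that arithmetic:

```python
# CO2 needed per hour for MOXIE's oxygen target, via 2 CO2 -> 2 CO + O2
M_O2, M_CO2 = 32.0, 44.0        # g/mol, approximate molar masses
o2_rate_g_per_hr = 6.0          # MOXIE's target production rate

mol_o2 = o2_rate_g_per_hr / M_O2          # mol O2 per hour
mol_co2 = 2.0 * mol_o2                    # 2 mol CO2 consumed per mol O2 produced
co2_rate_g_per_hr = mol_co2 * M_CO2

print(f"CO2 consumed: {co2_rate_g_per_hr:.1f} g/hr")   # ~16.5 g of CO2 per hour
```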

 

Since the rover's landing in February 2021, MOXIE engineers have started up the instrument seven times throughout the Martian year, each time taking a few hours to warm up, then another hour to make oxygen before powering back down. Each run was scheduled for a different time of day or night, and in different seasons, to see whether MOXIE could accommodate shifts in the planet's atmospheric conditions.

 

"The atmosphere of Mars is far more variable than Earth," Hoffman notes. "The density of the air can vary by a factor of two through the year, and the temperature can vary by 100 degrees. One objective is to show we can run in all seasons." So far, MOXIE has shown that it can make oxygen at almost any time of the Martian day and year.

 

"The only thing we have not demonstrated is running at dawn or dusk, when the temperature is changing substantially," Hecht says. "We do have an ace up our sleeve that will let us do that, and once we test that in the lab, we can reach that last milestone to show we can really run any time."


Mixed reality version of non-invasive neural brain-computer interface: Galea from OpenBCI


At CES 2023 in Las Vegas, neurotechnology company OpenBCI announced a new mixed reality version of their non-invasive neural interface platform, Galea. Customers of the Galea Beta Program can now order their device integrated with the Varjo XR-3. OpenBCI and Varjo previously announced their partnership and integration of Galea with Varjo’s VR-only Aero headset. Today’s announcement marks a significant expansion of Galea’s capabilities by unlocking mixed reality use cases that rely on the Varjo XR-3’s low-latency pass-through, depth awareness, and integrated hand tracking. 

Galea is a hardware and software platform that merges next-generation brain-computer interface technology with head-mounted displays. Galea is the first headset that simultaneously measures the user’s heart, skin, muscles, eyes, and brain. Galea’s multi-modal sensor network and integrated software dramatically simplify the process of collecting tightly-synchronized data from the brain and body. Galea unlocks new techniques for objectively measuring user experiences and internal states of mind.

“The decision to add a mixed reality version of Galea was driven largely by conversations we’ve been having with our early partners and Beta Program customers,” said OpenBCI Chief Commercial Officer, Joseph Artuso. “VR has been more of the focus for gaming and entertainment customers, but automotive, aviation, training, and industrial use-cases for Galea have focused more on how the biosensor data can be used in combination with mixed reality environments.”

This year at CES, Galea also received Innovation Award honors in the AR/VR and Wearables categories. Galea has also previously won a Unity Aerospace & Defense award and an AWE Auggie Award for Best Interaction Product.

 

Pre-orders for Galea beta devices are now open via galea.co. Beta devices will come fully integrated with either the Varjo Aero or XR-3 and will also support the ability to remove the head-mounted display entirely and use Galea’s multi-modal sensor network as a standalone device in real-world settings. The fifth and final batch of the Galea Beta Program will close on February 15, 2023. Beta devices are estimated to begin shipping in Q3 2023.


Fusion Technology Is Reaching a Turning Point That Could Change The Energy Game Completely


Our society faces the grand challenge of providing sustainable, secure, and affordable means of generating energy while trying to reduce carbon dioxide emissions to net zero around 2050. To date, developments in fusion power, which potentially ticks all these boxes, have been funded almost exclusively by the public sector. However, something is changing. Private equity investment in the global fusion industry has more than doubled in just one year – from US$2.1 billion in 2021 to US$4.7 billion in 2022, according to a survey from the Fusion Industry Association. So, what is driving this recent change? There's lots to be excited about.

 

Merging atoms together

Fusion works the same way our Sun does, by merging two heavy hydrogen atoms under extreme heat and pressure to release vast amounts of energy. It's the opposite of the fission process used by nuclear power plants, in which atoms are split to release large amounts of energy. Sustaining nuclear fusion at scale has the potential to produce a safe, clean, almost inexhaustible power source. Our Sun sustains fusion at its core with a plasma of charged particles at around 15 million degrees Celsius. Down on Earth, we are aiming for hundreds of millions of degrees Celsius, because we don't have the enormous mass of the Sun compressing the fuel down for us.

 

Scientists and engineers have worked out several designs for how we might achieve this, but most fusion reactors use strong magnetic fields to "bottle" and confine the hot plasma.

Generally, the main challenge to overcome on our road to commercial fusion power is to provide environments that can contain the intense burning plasma needed to produce a fusion reaction that is self-sustaining, producing more energy than was needed to get it started.

 

Fusion development has been progressing since the 1950s. Most of it was driven by government funding for fundamental science. Now, a growing number of private fusion companies around the world are forging ahead toward commercial fusion energy. A change in government attitudes has been crucial to this. The US and UK governments are fostering public-private partnerships to complement their strategic research programs. For example, the White House recently announced it would develop a "bold decadal vision for commercial fusion energy". In the United Kingdom, the government has invested in a program aimed at connecting a fusion generator to the national electricity grid.

 

Now scientists may have unlocked a path to abundant clean energy by recreating the process of nuclear fusion that powers the sun. Researchers at the National Ignition Facility at Lawrence Livermore National Laboratory in California were able to spark a fusion reaction that briefly sustained itself, a major feat because fusion requires such high temperatures and pressures that it easily fizzles out. Similar tests had been performed before, including a record-setting shot in August 2021 that generated more energy than scientists predicted and damaged some equipment. But the experiment announced in December 2022 was the first to generate more energy than the lasers delivered to the target, meaning scientists could now, in principle, harness nuclear fusion as an energy source.

 

It could represent a groundbreaking moment in humankind's move away from fossil fuels like oil and coal toward completely clean energy sources that do not pollute the air or scar landscapes with mining and pipelines. The ultimate goal, still years away, is to generate power the way the sun generates heat: by pushing hydrogen atoms so close to each other that they combine into helium, releasing torrents of energy. A single cupful of that hydrogen fuel could power an average-sized house for hundreds of years, with no carbon emissions. Using the world's largest laser, whose 192 beams produce temperatures more than three times hotter than the center of the sun, the researchers coaxed fusion fuel, for the first time, to heat itself beyond the heat they zapped into it, achieving a net energy gain.


CubeSat Set to Demonstrate NASA’s Fastest Laser Link from Space

NASA’s Pathfinder Technology Demonstrator 3 (PTD-3) mission, carrying the TeraByte InfraRed Delivery (TBIRD) system, will debut on May 25, 2022, as part of SpaceX’s Transporter-5 rideshare launch. TBIRD will showcase the high-data-rate capabilities of laser communications from a CubeSat in low-Earth orbit. At 200 gigabits per second (Gbps), TBIRD will downlink data at the highest optical rate ever achieved by NASA.

 

NASA primarily uses radio frequency to communicate with spacecraft, but with sights set on human exploration of the Moon and Mars and the development of enhanced scientific instruments, NASA needs more efficient communications systems to transmit significant amounts of data. With more data, researchers can make profound discoveries. Laser communications substantially increases data transport capabilities, offering higher data rates and more information packed into a single transmission.

 

“TBIRD is a game changer and will be very important for future human exploration and science missions,” said Andreas Doulaveris, TBIRD’s mission systems engineer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. With a single seven-minute pass at 200 Gbps, TBIRD will send back terabytes of data and give NASA more insight into the capabilities of laser communications. The addition of laser communications to spacecraft is similar to switching from dial-up to high-speed internet.
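
The terabytes-per-pass claim follows directly from the link rate. Here is a quick estimate for a single seven-minute pass at 200 Gbps, ignoring protocol overhead and dropouts.

```python
# Data volume for one ground-station pass at TBIRD's downlink rate
rate_gbps = 200.0            # gigabits per second
pass_minutes = 7.0

bits = rate_gbps * 1e9 * pass_minutes * 60
terabytes = bits / 8 / 1e12
print(f"~{terabytes:.1f} TB per pass")   # ~10.5 TB
```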

 

“As future science instruments and imaging systems incorporate the latest technology advancements, they’ll return very large volumes of data on a daily basis,” said Jason Mitchell, Director of the Advanced Communications and Navigation Technology division within NASA’s Space Communications and Navigation (SCaN) program. “These missions will need the downlink capabilities that laser communications can provide.”

 

The TBIRD system, funded by SCaN and built by the Massachusetts Institute of Technology Lincoln Laboratory in Lexington, is about the size of a tissue box and is integrated into PTD-3, a CubeSat that is the size of two stacked cereal boxes.


New technology creates carbon neutral chemicals out of captured carbon dioxide


The technology could allow scientists to both capture CO2 and transform it into useful chemicals such as carbon monoxide and synthetic natural gas in one circular process. 

 

Dr Melis Duyar, Senior Lecturer of Chemical Engineering at the University of Surrey, commented: “Capturing CO2 from the surrounding air and directly converting it into useful products is exactly what we need to approach carbon neutrality in the chemicals sector. This could very well be a milestone in the steps needed for the UK to reach its 2050 net-zero goals. We need to get away from our current thinking on how we produce chemicals, as current practices rely on fossil fuels which are not sustainable. With this technology we can supply chemicals with a much lower carbon footprint and look at replacing fossil fuels with carbon dioxide and renewable hydrogen as the building blocks of other important chemicals.”

 

The technology uses patent-pending switchable Dual Function Materials (DFMs), that capture carbon dioxide on their surface and catalyse the conversion of captured CO2 directly into chemicals. The “switchable” nature of the DFMs comes from their ability to produce multiple chemicals depending on the operating conditions or the composition of the added reactant. This makes the technology responsive to variations in demand for chemicals as well as availability of renewable hydrogen as a reactant.  

 

Dr Duyar continued: “These outcomes are a testament to the research excellence at Surrey, with continuously improving facilities, internal funding schemes and a collaborative culture.” 

Loukia-Pantzechroula Merkouri, the postgraduate student leading this research at the University of Surrey, added: “Not only does this research demonstrate a viable solution to the production of carbon neutral fuels and chemicals, but it also offers an innovative approach to combat the ever-increasing CO2 emissions contributing to global warming.”


Waymo Robotaxis Open to Public Transportation in Phoenix, AZ, Granted San Francisco License


In San Francisco, California, the self-driving tech company – owned by Google parent Alphabet – moved a step closer to launching a fully autonomous commercialized ride-hailing service, as is currently being operated by its chief rival, General Motors-owned Cruise.

 

And in Phoenix, Arizona, Waymo’s driverless service has been made available to members of the general public in the central Downtown area. The breakthrough in San Francisco has come via the approval by the California Department of Motor Vehicles to an amendment of the company’s current permit to operate. Now Waymo will be able to charge fees for driverless services in its autonomous vehicles (AVs), such as deliveries.

 

Once it has operated a driverless service on public roads in the city for a total of 30 days, it will then be eligible to submit an application to the California Public Utilities Commission (CPUC) for a permit that would enable it to charge fares for passenger-only autonomous rides in its vehicles.

 

This is the same permit that provided the greenlight for Cruise’s commercial driverless ride-hail service at the start of June. The CPUC awarded a drivered deployment permit to Waymo in February which allowed the company to charge its ‘trusted testers’ for autonomous rides with a safety operator on board. The trusted tester program comprises vetted members of the public who have applied to use the service and have signed an NDA which means they will not talk about their experiences publicly. In downtown Phoenix, the extension of the driverless ride-hail service is the latest evidence of the incremental progress Waymo has made in the city.

 

Over the past couple of years, the company has operated a paid rider-only service in some of Phoenix’s eastern suburbs, such as Gilbert, Mesa, Chandler and Tempe. Earlier this year it moved into the busier, more central downtown area, where driverless rides were made available for trusted testers. It also trialled an autonomous service for employees at Phoenix Sky Harbor International Airport, albeit with a safety operator on board. In early November, it was confirmed that airport rides would be offered to trusted testers, although again Waymo made clear that there would be a specialist in the driver’s seat, initially at least.


Moore’s Law – Now and in the Future: INTEL predicts 1 Trillion transistors on a single 3D chip by 2030


Moore's Law predicts that the number of transistors per device will double every two years, and it is, and always has been, driven by innovation. Charting transistors per device across past, present and future, the gains for the first 40 years came primarily from innovations in Intel's process technology. Going forward, gains will come from innovations in both process and packaging. Intel's processes will continue to deliver historic density improvements, while its 2D and 3D stacking technologies give architects and designers more tools to increase the number of transistors per device. As designers look to innovative technologies such as High NA lithography, RibbonFET, PowerVia, and Foveros Omni and Direct, Intel sees no end to innovation and therefore currently no end to Moore's Law.

In summary, when all the various process and advanced packaging innovations are considered, there are numerous options available to keep doubling the number of transistors per device at the cadence customers demand. Moore's Law only stops when innovation stops, and innovation continues unabated at Intel in process, packaging and architecture. The company remains undeterred in its aspiration to deliver approximately 1 trillion transistors in a single device by 2030.
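
As a sanity check on the 2030 target, a doubling every two years can be projected forward from today's largest devices. The starting count below (about 100 billion transistors, roughly the scale of today's biggest GPUs and multi-die packages) is an illustrative assumption, not an Intel figure.

```python
import math

# Project transistor counts under a Moore's-Law doubling every two years
start_transistors = 1e11     # ~100 billion, assumed current high-end device
target = 1e12                # 1 trillion, Intel's stated 2030 aspiration

doublings_needed = math.log2(target / start_transistors)
years_needed = doublings_needed * 2
print(f"{doublings_needed:.2f} doublings -> ~{years_needed:.1f} years")  # ~6.6 years
```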

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

SimpleRecon - 3D Reconstruction Without 3D Convolutions makes fast and accurate reconstruction possible

The project website and code are linked below.

 

Traditionally, 3D indoor scene reconstruction from posed images happens in two phases: per-image depth estimation, followed by depth merging and surface reconstruction. Recently, a family of methods has emerged that performs reconstruction directly in a final 3D volumetric feature space. While these methods have shown impressive reconstruction results, they rely on expensive 3D convolutional layers, limiting their application in resource-constrained environments. In this work, the researchers instead go back to the traditional route and show how focusing on high-quality multi-view depth prediction leads to highly accurate 3D reconstructions using simple off-the-shelf depth fusion. They propose a simple state-of-the-art multi-view depth estimator with two main contributions: 1) a carefully designed 2D CNN that utilizes strong image priors alongside a plane-sweep feature volume and geometric losses, combined with 2) the integration of keyframe and geometric metadata into the cost volume, which allows informed depth-plane scoring. The method achieves a significant lead over the current state of the art for depth estimation, and close or better results for 3D reconstruction on ScanNet and 7-Scenes, yet still allows for online, real-time, low-memory reconstruction. SimpleRecon is also fast: at a batch size of one it runs at roughly 70 ms per frame, which makes accurate reconstruction via fast depth fusion possible.
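As a rough illustration of the plane-sweep cost-volume idea that such multi-view depth estimators build on -- not the authors' actual network -- the sketch below scores a stack of depth hypotheses and regresses a depth map. The geometric warping of source-view features onto each depth plane is assumed to have been done already, and all tensor sizes are invented.

```python
# Minimal, simplified NumPy sketch of plane-sweep cost-volume depth scoring.
import numpy as np

def plane_sweep_depth(ref_feats, warped_src, depth_planes):
    """
    ref_feats:    (C, H, W) features of the reference (keyframe) image.
    warped_src:   (D, C, H, W) source-view features warped onto D depth planes.
    depth_planes: (D,) candidate depth values.
    Returns an (H, W) depth map via a soft-argmin over matching scores.
    """
    # Matching cost: per-pixel dot product between reference and warped
    # features for each depth hypothesis -> (D, H, W).
    scores = np.einsum('chw,dchw->dhw', ref_feats, warped_src)
    # Softmax over the depth dimension turns scores into per-plane weights.
    weights = np.exp(scores - scores.max(axis=0, keepdims=True))
    weights /= weights.sum(axis=0, keepdims=True)
    # Expected depth (soft-argmin style regression).
    return np.tensordot(depth_planes, weights, axes=1)

# Toy usage with random features: 16 channels, 24x32 resolution, 64 planes.
rng = np.random.default_rng(0)
C, H, W, D = 16, 24, 32, 64
depth = plane_sweep_depth(rng.standard_normal((C, H, W)),
                          rng.standard_normal((D, C, H, W)),
                          np.linspace(0.5, 5.0, D))
print(depth.shape)  # (24, 32)
```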

 

SimpleRecon: 3D Reconstruction Without 3D Convolutions
Mohamed Sayed, John Gibson, Jamie Watson, Victor Adrian Prisacariu, Michael Firman, and Clément Godard
ECCV 2022

https://nianticlabs.github.io/simplerecon/
https://github.com/nianticlabs/simplerecon

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Novel Solar Harvesting System has the Potential to Generate Solar Power 24/7

Novel Solar Harvesting System has the Potential to Generate Solar Power 24/7 | Amazing Science | Scoop.it

The great inventor Thomas Edison once said, "So long as the sun shines, man will be able to develop power in abundance." He was not the first great mind to marvel at the notion of harnessing the power of the sun; for centuries inventors have been pondering and perfecting ways to harvest solar energy. They have done an amazing job with photovoltaic cells, which convert sunlight directly into electricity. And still, with all the research, history and science behind it, there are limits to how much solar power can be harvested and used -- since generation is restricted to the daytime.

 

A University of Houston professor is continuing the historic quest, reporting on a new type of solar energy harvesting system that breaks the efficiency record of all existing technologies. And no less important, it clears the way to use solar power 24/7. "With our architecture, the solar energy harvesting efficiency can be improved to the thermodynamic limit," report Bo Zhao, Kalsi Assistant Professor of mechanical engineering, and his doctoral student Sina Jafari Ghalekohneh in the journal Physical Review Applied. The thermodynamic limit is the absolute maximum theoretically possible conversion efficiency of sunlight into electricity.

 

Finding more efficient ways to harness solar energy is critical to transitioning to a carbon-free electric grid. According to a recent study by the U.S. Department of Energy Solar Energy Technologies Office and the National Renewable Energy Laboratory, solar could account for as much as 40% of the nation's electricity supply by 2035 and 45% by 2050, pending aggressive cost reductions, supportive policies and large-scale electrification.

 

How Does it Work?

Traditional solar thermo-photovoltaics (STPV) rely on an intermediate layer to tailor sunlight for better efficiency. The front side of this layer (the side facing the sun) is designed to absorb all photons coming from the sun, so the solar energy is converted into thermal energy that raises the layer's temperature. But the thermodynamic efficiency limit of STPVs, long understood to be the blackbody limit (85.4%), is still far below the Landsberg limit (93.3%), the ultimate efficiency limit for solar energy harvesting.
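For reference, both limits quoted above can be reproduced from the standard textbook expressions, assuming the conventional values of a 6000 K sun and a 300 K ambient. These temperatures are assumptions of this sketch, not values taken from the Physical Review Applied paper.

```python
# Hedged illustration: standard expressions for the two solar conversion limits.
import numpy as np

T_sun, T_amb = 6000.0, 300.0

# Landsberg limit: 1 - (4/3)(T0/Ts) + (1/3)(T0/Ts)^4
x = T_amb / T_sun
landsberg = 1 - (4 / 3) * x + (1 / 3) * x**4

# Blackbody (fully concentrated absorber) limit: Carnot-weighted absorber
# efficiency, maximized over the absorber temperature T.
T = np.linspace(T_amb + 1, T_sun - 1, 100000)
blackbody = np.max((1 - (T / T_sun) ** 4) * (1 - T_amb / T))

print(f"Landsberg limit ~ {landsberg:.3%}")   # ~93.3%
print(f"Blackbody limit ~ {blackbody:.3%}")   # ~85.4%
```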

 

"In this work, we show that the efficiency deficit is caused by the inevitable back emission of the intermediate layer towards the sun resulting from the reciprocity of the system. We propose nonreciprocal STPV systems that utilize an intermediate layer with nonreciprocal radiative properties," said Zhao. "Such a nonreciprocal intermediate layer can substantially suppress its back emission to the sun and funnel more photon flux towards the cell. We show that, with such improvement, the nonreciprocal STPV system can reach the Landsberg limit, and practical STPV systems with single-junction photovoltaic cells can also experience a significant efficiency boost."

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

AI models can now continually learn from new data on intelligent devices like smartphones

AI models can now continually learn from new data on intelligent devices like smartphones | Amazing Science | Scoop.it
A new technique enables on-device training of machine-learning models on edge devices like microcontrollers, which have very limited memory. This could allow edge devices to continually learn from new data, eliminating data privacy issues, while enabling user customization.

 

Microcontrollers, miniature computers that can run simple commands, are the basis for billions of connected devices, from internet-of-things (IoT) devices to sensors in automobiles. But cheap, low-power microcontrollers have extremely limited memory and no operating system, making it challenging to train artificial intelligence models on "edge devices" that work independently from central computing resources.

 

Training a machine-learning model on an intelligent edge device allows it to adapt to new data and make better predictions. For instance, training a model on a smart keyboard could enable the keyboard to continually learn from the user's writing. However, the training process requires so much memory that it is typically done using powerful computers at a data center, before the model is deployed on a device. This is more costly and raises privacy issues since user data must be sent to a central server.

 

To address this problem, researchers at MIT and the MIT-IBM Watson AI Lab developed a new technique that enables on-device training using less than a quarter of a megabyte of memory. Other training solutions designed for connected devices can use more than 500 megabytes of memory, greatly exceeding the 256-kilobyte capacity of most microcontrollers (there are 1,024 kilobytes in one megabyte).

 

The intelligent algorithms and framework the researchers developed reduce the amount of computation required to train a model, which makes the process faster and more memory efficient. Their technique can be used to train a machine-learning model on a microcontroller in a matter of minutes. This technique also preserves privacy by keeping data on the device, which could be especially beneficial when data are sensitive, such as in medical applications. It also could enable customization of a model based on the needs of users. Moreover, the framework preserves or improves the accuracy of the model when compared to other training approaches.
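The MIT technique itself relies on specialized sparse-update and quantization strategies not shown here. As a rough illustration of why restricting what gets updated shrinks the training footprint, the sketch below trains only a small classifier head on frozen, precomputed features, so just that layer's weights, gradients and inputs have to fit in RAM. All sizes are invented examples for a 256 KB-class device.

```python
# Illustrative sketch only (not the MIT/MIT-IBM method): head-only training.
import numpy as np

def train_head_only(features, labels, num_classes, lr=0.1, epochs=20):
    """Train a softmax classifier head on frozen, precomputed features."""
    n, d = features.shape
    W = np.zeros((d, num_classes), dtype=np.float32)
    b = np.zeros(num_classes, dtype=np.float32)
    onehot = np.eye(num_classes, dtype=np.float32)[labels]
    for _ in range(epochs):
        logits = features @ W + b
        logits -= logits.max(axis=1, keepdims=True)
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        grad = (probs - onehot) / n          # cross-entropy gradient
        W -= lr * features.T @ grad          # only head parameters change
        b -= lr * grad.sum(axis=0)
    return W, b

# Rough memory for the trainable state: a 64-dim feature, 10-class head in
# float32 needs (64*10 + 10) * 4 bytes ~= 2.6 KB -- well inside 256 KB.
rng = np.random.default_rng(0)
X = rng.standard_normal((128, 64)).astype(np.float32)
y = rng.integers(0, 10, size=128)
W, b = train_head_only(X, y, num_classes=10)
print(W.shape, b.shape)  # (64, 10) (10,)
```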

 

"Our study enables IoT devices to not only perform inference but also continuously update the AI models to newly collected data, paving the way for lifelong on-device learning. The low resource utilization makes deep learning more accessible and can have a broader reach, especially for low-power edge devices," says Song Han, an associate professor in the Department of Electrical Engineering and Computer Science (EECS), a member of the MIT-IBM Watson AI Lab, and senior author of the paper describing this innovation.

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Smelling in VR environment possible with new gaming technology

Smelling in VR environment possible with new gaming technology | Amazing Science | Scoop.it

An odor machine, a so-called olfactometer, makes it possible to smell in VR environments. First up is a "wine tasting game" in which the user smells wine in a virtual wine cellar and earns points for correctly guessing the aromas in each wine. The new technology, which can be produced on 3D printers, was developed in collaboration between Stockholm University and Malmö University. The research, funded by the Marianne and Marcus Wallenberg Foundation, was recently published in the International Journal of Human-Computer Studies.

 

"We hope that the new technical possibilities will lead to scents having a more important role in game development, says Jonas Olofsson, professor of psychology and leader of the research project at Stockholm University.

 

In the past, computer games have focused mostly on what we can see -- moving images on screens. Other senses have not been present. But an interdisciplinary research group at Stockholm University and Malmö University has now constructed a scent machine that can be controlled by a gaming computer. In the game, the participant moves in a virtual wine cellar, picking up virtual wine glasses containing different types of wine, guessing the aromas. The small scent machine is attached to the VR system's controller, and when the player lifts the glass, it releases a scent.
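The article does not describe the olfactometer's control interface, so the following is a purely hypothetical sketch of the game-side trigger logic: when the player lifts a virtual glass, the game picks the matching scent channel and asks the device to release a short pulse. The function and channel mapping are stand-ins, not the researchers' actual API.

```python
# Hypothetical game-side trigger logic for a controller-mounted scent device.
def send_to_olfactometer(channel: int, duration_ms: int) -> None:
    # Placeholder: a real setup would send a command to the hardware here
    # (e.g., selecting a scent channel and a pulse length).
    print(f"release scent on channel {channel} for {duration_ms} ms")

WINE_TO_CHANNEL = {"riesling": 0, "merlot": 1, "syrah": 2}  # assumed mapping

def on_glass_lifted(wine: str) -> None:
    """Called by the VR game when the player raises a virtual wine glass."""
    channel = WINE_TO_CHANNEL.get(wine)
    if channel is not None:
        send_to_olfactometer(channel, duration_ms=500)

on_glass_lifted("merlot")   # example event from the game loop
```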

 

"The possibility to move on from a passive to a more active sense of smell in the game world paves the way for the development of completely new smell-based game mechanics based on the players' movements and judgments," says Simon Niedenthal, interaction and game researcher at Malmö University.

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Researchers used cryo-electron microscopy to reveal the structure of bacteria's 'propellers' in near atomic detail

Researchers used cryo-electron microscopy to reveal the structure of bacteria's 'propellers' in near atomic detail | Amazing Science | Scoop.it

University of Virginia School of Medicine researchers and their collaborators have solved a decades-old mystery about how E. coli and other bacteria are able to move. Bacteria push themselves forward by coiling long, threadlike appendages into corkscrew shapes that act as makeshift propellers. But how exactly they do this has baffled scientists, because the "propellers" are made of a single protein.

 

An international team led by UVA's Edward H. Egelman, PhD, a leader in the field of high-tech cryo-electron microscopy (cryo-EM), has cracked the case. The researchers used cryo-EM and advanced computer modeling to reveal what no traditional light microscope could see: the strange structure of these propellers at the level of individual atoms.

 

"While models have existed for 50 years for how these filaments might form such regular coiled shapes, we have now determined the structure of these filaments in atomic detail," said Egelman, of UVA's Department of Biochemistry and Molecular Genetics.

 

"We can show that these models were wrong, and our new understanding will help pave the way for technologies that could be based upon such miniature propellers."

 

Blueprints for Bacteria's 'Supercoils'

Different bacteria have one or many of the threadlike appendages known as flagella (the singular is flagellum). A flagellum is made of thousands of subunits, but all of these subunits are exactly the same. You might think that such a tail would be straight, or at best a bit flexible, but that would leave the bacteria unable to move, because such shapes cannot generate thrust. It takes a rotating, corkscrew-like propeller to push a bacterium forward. Scientists call the formation of this shape "supercoiling," and now, after more than 50 years, they understand how bacteria do it.

 

Using cryo-EM, Egelman and his team found that the protein that makes up the flagellum can exist in 11 different states, and it is the precise mixture of these states that causes the corkscrew shape to form. It has long been known that the propeller in bacteria is quite different from the similar propellers used by hardy one-celled organisms called archaea. Archaea are found in some of the most extreme environments on Earth, such as nearly boiling pools of acid, the very bottom of the ocean and petroleum deposits deep in the ground.

 

Egelman and colleagues used cryo-EM to examine the flagella of one form of archaea, Saccharolobus islandicus, and found that the protein forming its flagellum exists in 10 different states. While the details were quite different from what the researchers saw in bacteria, the result was the same, with the filaments forming regular corkscrews. They conclude that this is an example of "convergent evolution" -- when nature arrives at similar solutions via very different means. This shows that even though bacterial and archaeal propellers are similar in form and function, the organisms evolved those traits independently.

 

"As with birds, bats and bees, which have all independently evolved wings for flying, the evolution of bacteria and archaea has converged on a similar solution for swimming in both," said Egelman, whose prior imaging work saw him inducted into the National Academy of Sciences, one of the highest honors a scientist can receive. "Since these biological structures emerged on Earth billions of years ago, the 50 years that it has taken to understand them may not seem that long."

No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Tiny Swimming MicroRobots Made from Algae Cells Treat Deadly Pneumonia in Mice Efficiently

Tiny Swimming MicroRobots Made from Algae Cells Treat Deadly Pneumonia in Mice Efficiently | Amazing Science | Scoop.it

Nanoengineers at the University of California San Diego have developed microscopic robots, called microrobots, that can swim around in the lungs, deliver medication and be used to clear up life-threatening cases of bacterial pneumonia.

 

In mice, the microrobots safely eliminated pneumonia-causing bacteria in the lungs and resulted in 100% survival. By contrast, untreated mice all died within three days after infection. The results are published Sept. 22, 2022 in Nature Materials. The microrobots are made of algae cells whose surfaces are speckled with antibiotic-filled nanoparticles. The algae provide movement, which allows the microrobots to swim around and deliver antibiotics directly to more bacteria in the lungs. The nanoparticles containing the antibiotics are made of tiny biodegradable polymer spheres that are coated with the cell membranes of neutrophils, which are a type of white blood cell. What's special about these cell membranes is that they absorb and neutralize inflammatory molecules produced by bacteria and the body's immune system. This gives the microrobots the ability to reduce harmful inflammation, which in turn makes them more effective at fighting lung infection.

 

The work is a joint effort between the labs of nanoengineering professors Joseph Wang and Liangfang Zhang, both at the UC San Diego Jacobs School of Engineering. Wang is a world leader in the field of micro- and nanorobotics research, while Zhang is a world leader in developing cell-mimicking nanoparticles for treating infections and diseases. Together, they have pioneered the development of tiny drug-delivering robots that can be safely used in live animals to treat bacterial infections in the stomach and blood. Treating bacterial lung infections is the latest in their line of work. "Our goal is to do targeted drug delivery into more challenging parts of the body, like the lungs. And we want to do it in a way that is safe, easy, biocompatible and long lasting," said Zhang. "That is what we've demonstrated in this work."

 

The team used the microrobots to treat mice with an acute and potentially fatal form of pneumonia caused by the bacteria Pseudomonas aeruginosa. This form of pneumonia commonly affects patients who receive mechanical ventilation in the intensive care unit. The researchers administered the microrobots to the lungs of the mice through a tube inserted in the windpipe. The infections fully cleared up after one week. All mice treated with the microrobots survived past 30 days, while untreated mice died within three days.

 

Treatment with the microrobots was also more effective than an IV injection of antibiotics into the bloodstream. The latter required a dose of antibiotics that was 3000 times higher than that used in the microrobots to achieve the same effect. For comparison, a dose of microrobots provided 500 nanograms of antibiotics per mouse, while an IV injection provided 1.644 milligrams of antibiotics per mouse.
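A quick arithmetic check of that comparison, using the doses quoted above, shows the two figures are consistent.

```python
# Sanity check of the dose comparison (values taken from the text above).
microrobot_dose_ng = 500            # nanograms of antibiotic per mouse
iv_dose_mg = 1.644                  # milligrams of antibiotic per mouse
ratio = (iv_dose_mg * 1e6) / microrobot_dose_ng
print(round(ratio))                 # ~3288, i.e. roughly the quoted "3000x"
```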

 

The team's approach is so effective because it puts the medication right where it needs to go rather than diffusing it through the rest of the body. "These results show how targeted drug delivery combined with active movement from the microalgae improves therapeutic efficacy," said Wang.

No comment yet.