Amazing Science
Amazing science facts - 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video
Scooped by Dr. Stefan Gruenwald

AI enables accurate electronic structure calculations at large scales

The arrangement of electrons in matter, known as the electronic structure, plays a crucial role both in fundamental research and in applied fields such as drug design and energy storage. However, the lack of a simulation technique that offers both high fidelity and scalability across different time and length scales has long been a roadblock to progress in these areas.

 

Researchers from the Center for Advanced Systems Understanding (CASUS) at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) in Görlitz, Germany, and Sandia National Laboratories in Albuquerque, New Mexico, U.S., have now pioneered a machine learning–based simulation method that supersedes traditional electronic structure simulation techniques.

Their Materials Learning Algorithms (MALA) software stack enables access to previously unattainable length scales. The work is published in the journal npj Computational Materials.

 

Electrons are elementary particles of fundamental importance. Their quantum mechanical interactions with one another and with atomic nuclei give rise to a multitude of phenomena observed in chemistry and materials science. Understanding and controlling the electronic structure of matter provides insights into the reactivity of molecules, the structure and energy transport within planets, and the mechanisms of material failure.

Scientific challenges are increasingly being addressed through computational modeling and simulation, leveraging the capabilities of high-performance computing. However, a significant obstacle to achieving realistic simulations with quantum precision is the lack of a predictive modeling technique that combines high accuracy with scalability across different length and time scales.

 

Classical atomistic simulation methods can handle large and complex systems, but their omission of the quantum electronic structure restricts their applicability. Conversely, first-principles methods, which avoid empirical modeling and parameter fitting, provide high fidelity but are computationally demanding. For instance, density functional theory (DFT), a widely used first-principles method, exhibits cubic scaling with system size, restricting its predictive capabilities to small scales.
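To put cubic scaling in perspective, here is a minimal back-of-the-envelope sketch. The one-hour baseline for 1,000 atoms is an arbitrary assumption, not a figure from the article; only the relative growth matters.

```python
# Rough illustration of why O(N^3) scaling confines conventional DFT to small systems.
# The one-hour baseline for 1,000 atoms is a made-up placeholder.

def estimated_hours(n_atoms, baseline_atoms=1_000, baseline_hours=1.0, exponent=3):
    """Wall time if cost grows as n_atoms**exponent relative to a known baseline."""
    return baseline_hours * (n_atoms / baseline_atoms) ** exponent

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7,} atoms -> ~{estimated_hours(n):>13,.0f} hours")

# A tenfold increase in system size costs roughly a thousandfold more compute
# (1 hour -> ~1,000 hours -> ~1,000,000 hours), which is why large length
# scales are out of reach for straightforward DFT.
```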

Hybrid approach based on deep learning

The team has now presented a novel simulation method called the Materials Learning Algorithms (MALA) software stack. In computer science, a software stack is a collection of algorithms and software components combined into an application for solving a particular problem.

 

Lenz Fiedler, a Ph.D. student and key developer of MALA at CASUS, explains, "MALA integrates machine learning with physics-based approaches to predict the electronic structure of materials. It employs a hybrid approach, utilizing an established machine learning method called deep learning to accurately predict local quantities, complemented by physics algorithms for computing global quantities of interest."
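The quote describes a local-to-global split: a neural network predicts a local quantity on a real-space grid, and a physics-based post-processing step turns those local predictions into global observables. The sketch below illustrates that idea only; it is not MALA's actual API, and the descriptor size, energy grid, temperature, and network architecture are assumptions made for the example (a local density of states, LDOS, integrated with a Fermi occupation to give an electron density).

```python
# Hypothetical sketch of a local-to-global hybrid workflow (not MALA's real API):
# a small neural network predicts a local density of states (LDOS) from
# per-grid-point descriptors, and a physics step integrates the local
# predictions into a global observable such as the electron density.

import torch
import torch.nn as nn

class LocalPredictor(nn.Module):
    """Maps a descriptor of the local atomic environment to an LDOS vector."""
    def __init__(self, n_descriptors=91, n_energy_points=250):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_descriptors, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, n_energy_points),
        )

    def forward(self, descriptors):          # (n_grid_points, n_descriptors)
        return self.net(descriptors)         # (n_grid_points, n_energy_points)

def electron_density(ldos, energies, fermi_energy, temperature_ev=0.025):
    """Physics step: integrate LDOS weighted by a Fermi occupation over energy."""
    occupation = 1.0 / (1.0 + torch.exp((energies - fermi_energy) / temperature_ev))
    return torch.trapz(ldos * occupation, energies, dim=-1)   # density per grid point

# Toy usage with random inputs standing in for real local descriptors.
model = LocalPredictor()
descriptors = torch.rand(4096, 91)                 # 4096 grid points, 91 features (assumed)
energies = torch.linspace(-10.0, 10.0, 250)        # assumed energy grid in eV
ldos = model(descriptors)
density = electron_density(ldos, energies, fermi_energy=0.0)
print(density.shape)   # one density value per grid point; sums/integrals give global quantities
```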

Scooped by Dr. Stefan Gruenwald

Chemists Believe They Have Cracked The Complete Quantum Nature of Water Using AI

Chemists have produced the first full quantum mechanical model of water — one of the key ingredients of life. The Journal of Physical Chemistry Letters published the breakthrough, which used machine learning to develop a model that gives a detailed, accurate description for how large groups of water molecules interact with one another. “We believe we have found the missing piece to a complete, microscopic understanding of water,” says Joel Bowman, professor of theoretical chemistry at Emory University and senior author of the study. “It appears that we now have all that we need to know to describe water molecules under any conditions, including ice, liquid or vapor over a range of temperature and pressure.”

 

The researchers developed free, open-source software for the model, which they dubbed “q-AQUA.”  The q-AQUA software provides a universal tool for studying water. “We anticipate researchers using it for everything from predicting whether an exoplanet may have water to deepening our understanding of the role of water in cellular function,” Bowman says.

 

Bowman is one of the founders of the specialty of theoretical reaction dynamics and a leader in exploring mysteries underlying questions such as why we need water to live. First author of the study is Qi Yu, a former Emory PhD candidate in the Bowman Lab who has since graduated and is now a postdoctoral fellow at Yale. Co-authors include Apurba Nandi, a PhD candidate in the Bowman Lab at Emory; Riccardo Conte, a former Emory postdoctoral fellow in the Bowman Lab, now at the University of Milan; and Paul Houston, former dean of science at Georgia Institute of Technology and now an emeritus professor at Cornell University.

 

The discovery made the cover of the Journal of Physical Chemistry Letters.

 

Water covers most of the Earth’s surface and is vital to all living organisms. It consists of simple molecules, each made up of two hydrogen atoms and one oxygen atom, with neighboring molecules held together by hydrogen bonds.

Despite water’s simplicity and ubiquity, describing the interactions of clusters of H2O molecules under any conditions presents major challenges. Newton’s laws govern the behavior of heavy objects in the so-called classical world, including the motion of planets. Extremely light objects, however, at the level of atoms and electrons, belong to the quantum world, which is governed by the Schrödinger equation.

 

Each water molecule consists of a single oxygen atom and two hydrogen atoms. “We’re about 70% water by weight,” Bowman says, “and yet, from a chemical standpoint, we don’t really understand how water molecules interact with biological systems."

 

Although large, complex problems in the classical world can be divided into pieces to be solved, objects in the quantum world are too “fuzzy” to be broken down into discrete pieces. Researchers have tried to produce a quantum model of water by breaking it into the interactions of clusters of water molecules. Bowman compares it to people at a party clustered into conversational groups of two, three or four people.

 

“Imagine you’re trying to come up with a model to describe the conversations in each of these clusters of people that can be extended to the entire party,” he says. “First you gather the data for two people talking and determine what they are saying, who is saying what and what the conversation means. It gets harder when you try to model the conversations among three people. And when you get up to four people, it gets nearly impossible because so much data is coming at you.”
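The party analogy describes what theorists call a many-body expansion. In that general framework (the form below is the standard expansion; the article's description is consistent with a truncation after the four-body term), the energy of a collection of water molecules is written as a sum of contributions from individual molecules, pairs, triples and quadruples, each fitted to high-level quantum calculations:

\[
E(1,\dots,N) = \sum_{i} E^{(1)}_{i} + \sum_{i<j} E^{(2)}_{ij} + \sum_{i<j<k} E^{(3)}_{ijk} + \sum_{i<j<k<l} E^{(4)}_{ijkl} + \cdots
\]

The difficulty the analogy captures is that the number of three- and four-body terms, and the amount of quantum data needed to fit them, grows rapidly with cluster size.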

Scooped by Dr. Stefan Gruenwald

Machine learning technique speeds up crystal structure determination

Nanoengineers at the University of California San Diego have developed a computer-based method that could make it less labor-intensive to determine the crystal structures of various materials and molecules, including alloys, proteins and pharmaceuticals. The method uses a machine learning algorithm, similar to the type used in facial recognition and self-driving cars, to independently analyze electron diffraction patterns, and do so with at least 95% accuracy. The work is published in the Jan. 31 issue of Science.

 

A team led by UC San Diego nanoengineering professor Kenneth Vecchio and his Ph.D. student Kevin Kaufmann, who is the first author of the paper, developed the new approach. Their method involves using a scanning electron microscope (SEM) to collect electron backscatter diffraction (EBSD) patterns. Compared to other electron diffraction techniques, such as those in transmission electron microscopy (TEM), SEM-based EBSD can be performed on large samples and analyzed at multiple length scales. This provides local sub-micron information mapped to centimeter scales. For example, a modern EBSD system enables determination of fine-scale grain structures, crystal orientations, relative residual stress or strain, and other information in a single scan of the sample.

 

However, the drawback of commercial EBSD systems is the software's inability to determine the atomic structure of the crystalline lattices present within the material being analyzed. A user of the commercial software must therefore select up to five crystal structures presumed to be in the sample, after which the software attempts to find probable matches to the diffraction pattern. The complex nature of the diffraction pattern often causes the software to find false structure matches in the user-selected list. As a result, the accuracy of the existing software's lattice-type determination depends on the operator's experience and prior knowledge of the sample.

 

The method that Vecchio's team developed does this all autonomously, as the deep neural network independently analyzes each diffraction pattern to determine the crystal lattice, out of all possible lattice structure types, with a high degree of accuracy (greater than 95%).
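As a rough illustration of what such a classifier looks like, the hedged sketch below maps a single EBSD pattern image to one of the 14 Bravais lattice types. The architecture, image size, and label set are assumptions made for the example; this is not the UC San Diego team's actual network.

```python
# Hypothetical sketch of the idea (not the published model): a convolutional
# classifier that takes one EBSD pattern image and predicts a Bravais lattice type.

import torch
import torch.nn as nn

N_BRAVAIS_LATTICES = 14   # assumed label set; the paper's exact classes may differ

class EBSDClassifier(nn.Module):
    def __init__(self, n_classes=N_BRAVAIS_LATTICES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, patterns):              # (batch, 1, H, W) grayscale patterns
        x = self.features(patterns).flatten(1)
        return self.classifier(x)             # raw class scores (logits)

# Toy usage: a batch of 8 random 128x128 "patterns" standing in for real EBSD data.
model = EBSDClassifier()
logits = model(torch.rand(8, 1, 128, 128))
predicted_lattice = logits.argmax(dim=1)
print(predicted_lattice)
```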

Scooped by Dr. Stefan Gruenwald

Materials Made of Mechanical Neural Networks Can Learn to Adapt Their Physical Properties

A new type of material can learn and improve its ability to deal with unexpected forces thanks to a unique lattice structure with connections of variable stiffness, as described in a new paper. The new material is a type of architected material, which gets its properties mainly from the geometry and specific traits of its design rather than what it is made out of. Take hook-and-loop fabric closures like Velcro, for example. It doesn’t matter whether it is made from cotton, plastic or any other substance. As long as one side is a fabric with stiff hooks and the other side has fluffy loops, the material will have the sticky properties of Velcro.

 

The new material’s architecture is based on that of an artificial neural network—layers of interconnected nodes that can learn to do tasks by changing how much importance, or weight, they place on each connection. Theoretically, such a mechanical lattice with physical nodes could be trained to take on certain mechanical properties by adjusting each connection’s rigidity.

 

To find out if a mechanical lattice indeed would be able to adopt and maintain new properties—like taking on a new shape or changing directional strength—scientists started off by building a computer model. They then selected a desired shape for the material as well as input forces and had a computer algorithm tune the tensions of the connections so that the input forces would produce the desired shape. They did this training on 200 different lattice structures and found that a triangular lattice was best at achieving all of the shapes tested.
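The training loop described above is, at its core, an inverse problem: choose connection stiffnesses so that prescribed input forces produce a target deformation. The toy sketch below captures only that idea for a one-dimensional chain of springs; the geometry, target values, and optimizer are made-up simplifications, not the researchers' model or code.

```python
# Toy version of the training idea: tune each spring's stiffness so that a fixed
# input force produces a target displacement pattern. Gradients flow through the
# static-equilibrium solve K(k) u = f. All numbers here are made up.

import torch

n_springs = 4                                          # fixed wall - k1 - k2 - k3 - k4
log_k = torch.zeros(n_springs, requires_grad=True)     # optimize log-stiffness (keeps k > 0)
force = torch.tensor([0.0, 0.0, 0.0, 1.0])             # unit pull on the free end
target_u = torch.tensor([0.10, 0.25, 0.45, 0.70])      # desired node displacements

def stiffness_matrix(k):
    """Tridiagonal stiffness matrix of a fixed-free chain of linear springs."""
    diag = k + torch.cat([k[1:], torch.zeros(1)])      # k_i + k_(i+1); last node has only k_n
    off = -k[1:]                                       # coupling to the neighboring node
    return torch.diag(diag) + torch.diag(off, 1) + torch.diag(off, -1)

optimizer = torch.optim.Adam([log_k], lr=0.05)
for step in range(1000):
    k = log_k.exp()
    u = torch.linalg.solve(stiffness_matrix(k), force)  # displacements at equilibrium
    loss = ((u - target_u) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print("tuned stiffnesses :", log_k.exp().detach().numpy().round(2))
print("achieved shape    :", u.detach().numpy().round(3))
```

In the actual work the same principle is applied to a two-dimensional triangular lattice of electromechanically adjustable springs, with the target being a desired output deformation for given input forces.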

 

Once the many connections are tuned to achieve a set of tasks, the material will continue to react in the desired way. The training is—in a sense—remembered in the structure of the material itself.

 

The researchers then built a physical prototype lattice with adjustable electromechanical springs arranged in a triangular lattice. The prototype is made of 6-inch connections and is about 2 feet long by 1½ feet wide. And it worked beautifully. When the lattice and algorithm worked together, the material was able to learn and change shape in particular ways when subjected to different forces. The scientists call this new material a mechanical neural network.

Scooped by Dr. Stefan Gruenwald

Artificial intelligence identifies optimal material formula

Nanostructured layers boast countless potential properties -- but how can the most suitable one be identified without any long-term experiments? A team has ventured a shortcut: using a machine learning algorithm, the researchers were able to reliably predict the properties of such a layer.

 

Porous or dense, columns or fibers

During the manufacture of thin films, numerous control variables determine the condition of the surface and, consequently, its properties. Relevant factors include the composition of the layer as well as process conditions during its formation, such as temperature. All these elements put together result in the creation of either a porous or a dense layer during the coating process, with atoms combining to form columns or fibers. "In order to find the optimal parameters for an application, it used to be necessary to conduct countless experiments under different conditions and with different compositions; this is an incredibly complex process," explains Professor Alfred Ludwig, Head of the Materials Discovery and Interfaces Team.

 

Such experiments yield so-called structure zone diagrams, from which researchers can read off the surface that a given composition will form under given process parameters. "Experienced researchers can subsequently use such a diagram to identify the most suitable location for an application and derive the parameters necessary for producing the suitable layer," points out Ludwig. "The entire process requires an enormous effort and is highly time consuming."

 

Algorithm predicts surface

Striving to find a shortcut towards the optimal material, the team took advantage of artificial intelligence, more precisely machine learning. To this end, PhD researcher Lars Banko, together with colleagues from the Interdisciplinary Centre for Advanced Materials Simulation at RUB, Icams for short, modified a so-called generative model. He then trained this algorithm to generate images of the surface of a thoroughly researched model layer of aluminum, chromium and nitrogen using specific process parameters, in order to predict what the layer would look like under the respective conditions.

 

"We fed the algorithm with a sufficient amount of experimental data in order to train it, but not with all known data," stresses Lars Banko. Thus, the researchers were able to compare the results of the calculations with those of the experiments and analyze how reliable its prediction was. The results were conclusive: "We combined five parameters and were able to look in five directions simultaneously using the algorithm -- without having to conduct any experiments at all," outlines Alfred Ludwig. "We have thus shown that machine learning methods can be transferred to materials research and can help to develop new materials for specific purposes."

Rescooped by Dr. Stefan Gruenwald from Papers

Can artificial intelligence create the next wonder material?

It's a strong contender for the geekiest video ever made: a close-up of a smartphone with line upon line of numbers and symbols scrolling down the screen. But when visitors stop by Nicola Marzari's office, which overlooks Lake Geneva, he can hardly wait to show it off. “It's from 2010,” he says, “and this is my cellphone calculating the electronic structure of silicon in real time!”

 

Even back then, explains Marzari, a physicist at the Swiss Federal Institute of Technology in Lausanne (EPFL), Switzerland, his now-ancient handset took just 40 seconds to carry out quantum-mechanical calculations that once took many hours on a supercomputer — a feat that not only shows how far such computational methods have come in the past decade or so, but also demonstrates their potential for transforming the way materials science is done in the future.

 

Instead of continuing to develop new materials the old-fashioned way — stumbling across them by luck, then painstakingly measuring their properties in the laboratory — Marzari and like-minded researchers are using computer modelling and machine-learning techniques to generate libraries of candidate materials by the tens of thousands. Even data from failed experiments can provide useful input [1]. Many of these candidates are completely hypothetical, but engineers are already beginning to shortlist those that are worth synthesizing and testing for specific applications by searching through their predicted properties — for example, how well they will work as a conductor or an insulator, whether they will act as a magnet, and how much heat and pressure they can withstand.
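In practice, that shortlisting step amounts to filtering a large table of predicted properties. A trivial, hedged illustration follows; the property names, values, and thresholds are invented for the example, not drawn from any of the real databases.

```python
# Invented toy records standing in for entries in a computed materials database.
candidates = [
    {"formula": "A2B", "band_gap_eV": 0.0, "is_magnetic": False, "max_temp_K": 900},
    {"formula": "AB3", "band_gap_eV": 2.1, "is_magnetic": False, "max_temp_K": 1400},
    {"formula": "A3C", "band_gap_eV": 1.4, "is_magnetic": True,  "max_temp_K": 650},
]

# Example screen: non-magnetic insulators that survive at least 1000 K.
shortlist = [
    c for c in candidates
    if c["band_gap_eV"] > 1.0 and not c["is_magnetic"] and c["max_temp_K"] >= 1000
]
print([c["formula"] for c in shortlist])   # -> ['AB3']
```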

 

The hope is that this approach will provide a huge leap in the speed and efficiency of materials discovery, says Gerbrand Ceder, a materials scientist at the University of California, Berkeley, and a pioneer in this field. “We probably know about 1% of the properties of existing materials,” he says, pointing to the example of lithium iron phosphate: a compound that was first synthesized [2] in the 1930s, but was not recognized [3] as a promising replacement material for current-generation lithium-ion batteries until 1996. “No one had bothered to measure its voltage before,” says Ceder.

At least three major materials databases already exist around the world, each encompassing tens or hundreds of thousands of compounds. Marzari's Lausanne-based Materials Cloud project is scheduled to launch later this year. And the wider community is beginning to take notice. “We are now seeing a real convergence of what experimentalists want and what theorists can deliver,” says Neil Alford, a materials scientist who serves as vice-dean for research at Imperial College London, but who has no affiliation with any of the database projects.


Via Complexity Digest