Podcast on "Orb: A Fast, Scalable Neural Network Potential"

Voice is AI-generated
Description
An AI-generated podcast covering the paper "Orb: A Fast, Scalable Neural Network Potential" by Neumann et al. Key takeaways about the Orb models:

  • State-of-the-art performance: At the time of its release, Orb ranked among the top state-of-the-art foundation models for materials prediction.
  • Computational efficiency: Orb is around 3-6 times faster than its closest competitors, making it suitable for larger-scale simulations.
  • Generalization ability: Orb remains stable when simulating diverse systems, including out-of-distribution small molecules and complex porous materials such as metal-organic frameworks (MOFs).
This paper is a preprint and has not been certified by peer review.

Orb: A Fast, Scalable Neural Network Potential

Authors

Mark Neumann, James Gin, Benjamin Rhodes, Steven Bennett, Zhiyi Li, Hitarth Choubisa, Arthur Hussey, Jonathan Godwin

Abstract

We introduce Orb, a family of universal interatomic potentials for atomistic modelling of materials. Orb models are 3-6 times faster than existing universal potentials, stable under simulation for a range of out of distribution materials and, upon release, represented a 31% reduction in error over other methods on the Matbench Discovery benchmark. We explore several aspects of foundation model development for materials, with a focus on diffusion pretraining. We evaluate Orb as a model for geometry optimization, Monte Carlo and molecular dynamics simulations.
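As a rough illustration of the workflows mentioned in the abstract, the sketch below shows how a universal interatomic potential such as Orb is typically plugged into an ASE pipeline for geometry optimization and a short molecular dynamics run. The commented-out orb_models imports are an assumption about the package layout, not the confirmed Orb API; ASE's built-in EMT calculator is used as a stand-in so the snippet runs on its own.

```python
# Minimal sketch: geometry optimization and a short MD run with a
# machine-learned interatomic potential, using the standard ASE workflow.
from ase import units
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.md.langevin import Langevin
from ase.optimize import BFGS

# Hypothetical Orb usage (package layout assumed, not verified):
# from orb_models.forcefield import pretrained
# from orb_models.forcefield.calculator import ORBCalculator
# calc = ORBCalculator(pretrained.orb_v2())
calc = EMT()  # stand-in calculator so the example is self-contained

# Build a small copper supercell and attach the calculator.
atoms = bulk("Cu", "fcc", a=3.6).repeat((2, 2, 2))
atoms.calc = calc

# Geometry optimization: relax positions until the max force < 0.05 eV/A.
BFGS(atoms).run(fmax=0.05)

# Short NVT molecular dynamics run at 300 K with a Langevin thermostat.
dyn = Langevin(atoms, timestep=1.0 * units.fs, temperature_K=300,
               friction=0.01 / units.fs)
dyn.run(100)
```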
