In the past decade, deep learning has revolutionized AI, producing breakthroughs in image recognition, language modeling, and game playing. Yet persistent limitations have emerged: data inefficiency, brittleness under distribution shift, high energy demands, and a superficial grasp of the laws of physics. As AI adoption deepens in critical sectors (from climate forecasting to medicine), these limitations become untenable.
A promising paradigm is emerging: physics-based AI, in which learning is constrained and guided by the laws of nature. Inspired by centuries of scientific progress, this hybrid approach embeds physical principles into machine learning models, opening new avenues for generality, interpretability, and reliability. The question is no longer whether we need to move beyond black-box learning, but how quickly we can make the transition.
The Case for Physics-Based AI
Why Physics, and Why Now?
Contemporary AI (particularly LLMs and vision models) excels at extracting correlations from large, often unstructured datasets. This purely data-driven approach breaks down where data is scarce, safety is critical, or the system is physically governed. In contrast, physics-based AI exploits:
- Physically grounded inductive biases: embedding symmetries, conservation laws, and invariances narrows the hypothesis space and steers learning toward feasible solutions.
- Sample efficiency: models with physical priors achieve more with less data, a key advantage in domains such as healthcare and computational science.
- Robustness and generalization: unlike black boxes, physics-informed models are less prone to unpredictable failure when extrapolating out of distribution.
- Interpretability and trust: predictions that obey known laws (such as conservation of energy) are more trustworthy and easier to explain.
The Physics-Based AI Landscape
Physics-Informed Neural Networks: The Workhorse
Physics-informed neural networks (PINNs) integrate physical knowledge by penalizing violations of the governing equations (typically PDEs) in the loss function. Over the past few years this idea has grown into a rich ecosystem:
- In climate and geoscience, PINNs deliver robust predictions of free-surface flows over complex terrain.
- In materials science and fluid dynamics, they model stress distributions, turbulence, and nonlinear wave propagation with compelling efficiency.
- In biomedical modeling, PINNs accurately simulate cardiac dynamics and tumor growth from sparse observations.
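To make the loss construction above concrete, here is a minimal PINN sketch. It is an illustrative toy, not any specific system from the text: the network size, the collocation sampling, and the ODE u' + u = 0 with u(0) = 1 (exact solution exp(-x)) are all assumptions chosen for brevity, and PyTorch is assumed as the framework.

```python
import torch

torch.manual_seed(0)

# Small MLP approximating the unknown solution u(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1))

def pinn_loss():
    # Random collocation points where the governing equation is enforced.
    x = torch.rand(64, 1, requires_grad=True)
    u = net(x)
    # du/dx via automatic differentiation.
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du + u                    # penalize violations of u' + u = 0
    bc = net(torch.zeros(1, 1)) - 1.0    # penalize violations of u(0) = 1
    return (residual ** 2).mean() + (bc ** 2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = pinn_loss()
    loss.backward()
    opt.step()

# The trained network should approximate u(x) = exp(-x) on [0, 1].
u1 = net(torch.tensor([[1.0]])).item()
```

Note that no solution data is used anywhere: the network is trained purely from the equation residual and the boundary condition, which is exactly the mechanism that lets PINNs work under sparse observations.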
Recent developments (2024–2025):
- Unified error analyses now provide rigorous decompositions of PINN error, shifting the community's focus toward more effective training strategies.
- Physics-informed PointNet enables PINN-style solutions across varying geometries without retraining for each new one.
- Next-generation PINNs adopt multimodal architectures that blend data-driven and physics-guided components to cope with partial observability and heterogeneity.
Neural Operators: Physics Across Infinite Dimensions
Classical machine learning models struggle when the discretization, coefficients, or boundary conditions of a physical equation change. Neural operators, especially Fourier neural operators (FNOs), instead learn mappings between function spaces:
- In weather forecasting, FNOs outperform CNNs at capturing nonlinear oceanic and atmospheric dynamics.
- Their limitations (such as a bias toward low frequencies) have been addressed through ensemble and multiscale operator techniques, improving accuracy at high frequencies.
- Transformer-based and multiscale neural operators now set the state of the art in global weather forecasting.
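The core of an FNO is a spectral convolution: the input function's lowest Fourier modes are multiplied by learned complex weights, so the same layer applies to any discretization of the function. The following is a minimal sketch of that idea (assuming PyTorch; the channel counts, mode count, and grid sizes are illustrative, and a full FNO would add pointwise linear paths and nonlinearities around this layer):

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Multiply the lowest Fourier modes of the input by learned weights."""
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (in_ch * out_ch)
        self.weight = torch.nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes, dtype=torch.cfloat))

    def forward(self, x):                      # x: (batch, in_ch, n_points)
        x_ft = torch.fft.rfft(x)               # to Fourier coefficients
        out_ft = torch.zeros(x.shape[0], self.weight.shape[1],
                             x_ft.shape[-1], dtype=torch.cfloat)
        # Learned mixing of the retained low-frequency modes.
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.shape[-1])  # back to grid values

layer = SpectralConv1d(in_ch=2, out_ch=2, modes=8)
coarse = layer(torch.randn(4, 2, 64))    # function sampled on 64 points
fine = layer(torch.randn(4, 2, 256))     # same weights on a 256-point grid
```

Because the parameters live in mode space rather than grid space, the layer accepts both the 64-point and the 256-point inputs unchanged, which is the discretization invariance that distinguishes neural operators from conventional CNNs.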
Differentiable Simulation: The Backbone of Data-Physics Fusion
Differentiable simulators make physical prediction part of the learning loop, enabling end-to-end gradient-based optimization:
- In tactile and contact physics, differentiable simulators support learning in contact-rich manipulation across both soft-body and rigid-body regimes.
- In neuroscience, differentiable simulation brings large-scale gradient-based optimization to detailed models of neural circuits.
- New physics engines such as Genesis deliver unprecedented simulation speed and scale for learning and robotics.
Recent work has catalogued the major approaches to differentiating through contact: LCP-based, convex-optimization-based, compliant, and position-based dynamics models.
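The end-to-end idea can be illustrated with a toy differentiable rollout: because the integrator is written in an autodiff framework, gradients flow through every simulation step, and a physical quantity can be optimized directly against a task objective. Everything here is an invented minimal example (assuming PyTorch): a unit-mass spring integrated with symplectic Euler, with the initial velocity tuned so the mass reaches a target position.

```python
import torch

def simulate(v0, steps=50, dt=0.01, k=10.0):
    """Symplectic-Euler rollout of a unit-mass spring; fully differentiable."""
    x = torch.zeros(())
    v = v0
    for _ in range(steps):
        v = v + dt * (-k * x)   # spring force F = -k x
        x = x + dt * v
    return x

# Optimize the initial velocity so the mass ends at x = 0.5.
v0 = torch.tensor(1.0, requires_grad=True)
opt = torch.optim.Adam([v0], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = (simulate(v0) - 0.5) ** 2
    loss.backward()            # gradient flows through all 50 steps
    opt.step()
final_x = simulate(v0).item()
```

Real differentiable simulators face a much harder version of this problem at contact events, where the dynamics are non-smooth; that is precisely what the LCP-based, convex, compliant, and position-based formulations above handle differently.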
Hybrid Physics-ML Models: The Best of Both Worlds
- In tropical cyclone prediction, hybrid neural-physics models combine data-driven learning with explicit physical code, extending forecast horizons well beyond previous limits.
- In manufacturing and engineering, hybrid models apply both empirical and physical constraints, overcoming the fragility of models built purely on black-box data or on first principles alone.
- In climate science, hybrid approaches enable physically plausible downscaling and uncertainty-aware prediction.
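A common pattern behind such hybrids is residual learning: a known physical model provides the backbone prediction and a small network learns only the correction, so the combined model falls back to pure physics where data is scarce. The sketch below is a made-up minimal instance (assuming PyTorch): the "physics" is a simple linear response, and the synthetic truth adds an unmodeled nonlinear term for the network to absorb.

```python
import torch

torch.manual_seed(0)

def physics_model(x):
    # Simplified known physics: a linear response (illustrative stand-in).
    return 2.0 * x

# Small network that learns only the residual the physics misses.
correction = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))

def hybrid(x):
    return physics_model(x) + correction(x)   # physics + learned residual

# Synthetic "truth": the physics plus an unmodeled nonlinear term.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2.0 * x + 0.3 * torch.sin(3.0 * x)

opt = torch.optim.Adam(correction.parameters(), lr=1e-2)
for _ in range(1000):
    opt.zero_grad()
    loss = ((hybrid(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
final_loss = ((hybrid(x) - y) ** 2).mean().item()
```

Only the correction network is trained; the physical backbone stays fixed, which keeps the hybrid's behavior anchored to first principles even outside the training range.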
Current Challenges and Research Frontiers
- Scalability: training physically constrained models efficiently at scale remains challenging, though mesh-free operators and faster simulators continue to advance.
- Partial observability and noise: handling noisy, partially observed data is an open research challenge that recent hybrid and multimodal models are beginning to address.
- Integration with foundation models: combining general-purpose AI models with explicit physical priors is a central research focus.
- Verification and validation: ensuring that models respect physical laws across all regimes remains technically demanding.
- Automated law discovery: PINN-inspired approaches are making data-driven discovery of scientific laws increasingly practical.
The Future: Toward a Physics-First AI Paradigm
The shift toward physics-based and hybrid models is not just another AI trend; it is essential for building intelligence that can infer, reason, and potentially discover new scientific laws. Promising directions include:
- Neuro-symbolic fusion, combining interpretable physical knowledge with deep networks.
- Real-time, mechanism-aware AI for trustworthy decision-making in robotics and digital twins.
- Automated scientific discovery, using advanced machine learning for causal reasoning and the discovery of governing laws.
These breakthroughs depend on close collaboration among machine learning researchers, physicists, and domain experts. The field's rapid progress is uniting data, compute, and domain knowledge, promising a new generation of AI capabilities for science and society.
References
- Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations, Raissi et al. (2019)
- Lagrangian Neural Networks, Cranmer et al. (2020)
- Hamiltonian Neural Networks, Greydanus et al. (2019)
- Fourier Neural Operator for Parametric Partial Differential Equations, Li et al. (2021)
- Neural Operator: Learning Maps Between Function Spaces, Kovachki et al. (2021)
- Scientific Machine Learning Through Physics-Informed Neural Networks: Where We Are and What's Next, Cuomo et al. (2022)
- Numerical Analysis of Physics-Informed Neural Networks and Related Models, De Ryck et al. (2024)
- Physics-Informed Neural Networks and Extensions, Raissi et al. (2024)
- Spherical neural operators for improved autoregressive global weather forecasting, Hu et al. (2025)
- Application to regional ocean modeling and prediction, Choi et al. (2024)
- Dazzi et al. (2024)
- DiffTaichi: Differentiable Programming for Physical Simulation, Hu et al. (2020)
- DiffTactile: A Physics-Based Differentiable Tactile Simulator for Contact-Rich Robotic Manipulation, Si et al. (2024)
- A Review of Differentiable Simulators, Newbury et al. (2024)
- Differentiable Physics Simulations with Contacts: Do They Have Correct Gradients w.r.t. Position, Velocity and Control?, Zhong et al. (2022)
- A Hybrid Machine Learning/Physics-Based Modeling Framework for Two-Week Extended Prediction, Liu et al. (2024)
- Jaxley: Differentiable Simulation Enables Large-Scale Training of Detailed Biophysical Models, Deistler et al. (2024)
- Revolutionizing Physics: A Comprehensive Survey of Machine Learning Applications, Suresh et al. (2024)
- A Library for Learning Neural Operators, Kossaifi et al. (2024); GitHub
- Genesis: A Universal Physics Platform for Robotics and Embodied AI, Genesis Embodied AI team (2024)
- Enforcing Analytic Constraints in Neural Networks Emulating Physical Systems, Beucler et al. (2021)

Michal Sutter is a data science professional with a master’s degree in data science from the University of Padua. With a solid foundation in statistical analysis, machine learning, and data engineering, Michal excels in transforming complex data sets into actionable insights.