Physics-Informed Machine Learning

Physics-Informed Machine Learning (PIML) is an emerging paradigm that integrates scientific domain knowledge (physical laws, constraints, or models) into machine learning workflows. In engineering systems—especially dynamic systems governed by differential equations—purely data-driven models often struggle with generalization and physical consistency. PIML addresses these challenges by blending first-principles physics with data-driven learning, thereby reducing the amount of training data required, improving extrapolation, and ensuring that model outputs obey known physical rules. This survey outlines key methods for injecting physics into ML models, presents code examples for simple dynamic systems (e.g., damped oscillators, reactors, thermal systems), and reviews both foundational and recent literature. Optimization applications, such as model predictive control (MPC), are emphasized alongside forward simulation.

I. Methods of Injecting Physics into ML Models

PIML techniques fall into several broad categories that, when combined appropriately, yield models that are both accurate and interpretable.

A. Physics-Based Feature Engineering

Feature engineering uses physical insights—such as conservation laws, symmetry, and nondimensional numbers—to construct input features that better capture underlying dynamics.

For example, consider a pendulum where the true acceleration depends on sin(θ) rather than θ itself. Using sin(θ) as a feature allows even a linear regression model to capture the nonlinear dependence of acceleration on the angle.
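A minimal sketch of this idea, assuming the ideal pendulum relation α = −(g/L)·sin(θ) (all parameter values here are illustrative): regression on the raw angle misses the nonlinearity, while regression on the physics-based feature sin(θ) recovers the true coefficient.

```python
import numpy as np

# Hypothetical pendulum: angular acceleration alpha = -(g/L) * sin(theta)
g, L = 9.81, 1.0
rng = np.random.default_rng(0)
theta = rng.uniform(-np.pi, np.pi, 200)   # sampled angles (rad)
alpha = -(g / L) * np.sin(theta)          # true angular acceleration

# Linear regression on the raw angle vs. on the physics-based feature sin(theta)
coef_raw, res_raw = np.linalg.lstsq(theta[:, None], alpha, rcond=None)[:2]
coef_phy, res_phy = np.linalg.lstsq(np.sin(theta)[:, None], alpha, rcond=None)[:2]

print(coef_phy[0])                    # recovers -g/L
print(res_phy.sum() < res_raw.sum())  # physics feature fits far better
```

The same linear solver is used in both cases; only the feature changes, which is the point of physics-based feature engineering.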

B. Custom Neural Network Architectures with Physical Structure

Neural network architectures can be designed so that they inherently respect physical laws. Examples include Hamiltonian Neural Networks (HNNs) and Lagrangian Neural Networks. These models embed conservation laws into their topology: a Hamiltonian network, for instance, learns a scalar function H(q, p) and obtains the dynamics from its gradients, dq/dt = ∂H/∂p and dp/dt = −∂H/∂q, so that energy conservation is satisfied automatically.
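The structural guarantee can be illustrated without any training. In an HNN the scalar function H would be a neural network; in this sketch, a known pendulum Hamiltonian stands in for the learned network, and the dynamics are generated purely from its gradients (here approximated by central differences):

```python
import numpy as np

def H(q, p):
    """Scalar 'Hamiltonian' (stand-in for a learned network): pendulum energy."""
    return 0.5 * p**2 + (1.0 - np.cos(q))

def dynamics(q, p, eps=1e-6):
    """Hamiltonian vector field from gradients of H (central differences)."""
    dH_dq = (H(q + eps, p) - H(q - eps, p)) / (2 * eps)
    dH_dp = (H(q, p + eps) - H(q, p - eps)) / (2 * eps)
    return dH_dp, -dH_dq          # dq/dt = dH/dp, dp/dt = -dH/dq

def rk4_step(q, p, dt):
    k1 = dynamics(q, p)
    k2 = dynamics(q + 0.5*dt*k1[0], p + 0.5*dt*k1[1])
    k3 = dynamics(q + 0.5*dt*k2[0], p + 0.5*dt*k2[1])
    k4 = dynamics(q + dt*k3[0], p + dt*k3[1])
    q += dt/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
    p += dt/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    return q, p

q, p = 1.0, 0.0
E0 = H(q, p)
for _ in range(1000):
    q, p = rk4_step(q, p, 0.01)
print(abs(H(q, p) - E0))   # energy drift stays small by construction
```

Because the vector field is derived from a single scalar H, energy is conserved by the model's structure regardless of what H looks like; a trained HNN inherits the same property.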

C. Physics-Based Constraints in the Loss Function (Soft Constraints)

Physics-Informed Neural Networks (PINNs) integrate the governing equations directly into the loss function as penalty terms. The loss penalizes the residual of the underlying PDE or ODE, so that the neural network learns a solution that satisfies both the data and the physics.
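A minimal linear-algebra sketch of this idea for a damped oscillator, x″ + 2ζωx′ + ω²x = 0 (parameter values illustrative): with a polynomial trial solution, both the data misfit and the ODE residual at collocation points are linear in the coefficients, so the combined "data + physics" loss can be minimized in one least-squares solve. A real PINN uses a neural network and gradient-based training, but the loss structure is the same.

```python
import numpy as np

# Damped oscillator: x'' + 2*zeta*omega*x' + omega**2 * x = 0
omega, zeta = 2.0, 0.1
deg = 10                               # polynomial trial solution x(t) = sum c_k t^k
t_col = np.linspace(0.0, 2.0, 50)      # collocation points for the physics residual
k = np.arange(deg + 1)

def basis(t):                          # rows of [1, t, t^2, ...]
    return t[:, None] ** k
def d_basis(t):                        # first-derivative basis rows
    return k * t[:, None] ** np.maximum(k - 1, 0)
def dd_basis(t):                       # second-derivative basis rows
    return k * (k - 1) * t[:, None] ** np.maximum(k - 2, 0)

# Physics rows: the ODE residual at each collocation point should vanish
A_phys = dd_basis(t_col) + 2*zeta*omega*d_basis(t_col) + omega**2 * basis(t_col)
b_phys = np.zeros(len(t_col))

# "Data" rows: initial conditions x(0) = 1, x'(0) = 0
t0 = np.array([0.0])
A_data = np.vstack([basis(t0), d_basis(t0)])
b_data = np.array([1.0, 0.0])
w = 100.0   # weight balancing data vs. physics loss (needs tuning, as noted)

c = np.linalg.lstsq(np.vstack([w * A_data, A_phys]),
                    np.concatenate([w * b_data, b_phys]), rcond=None)[0]

# Compare against the closed-form damped-oscillator solution at t = 1.5
wd = omega * np.sqrt(1 - zeta**2)
t_test = 1.5
x_true = np.exp(-zeta*omega*t_test) * (np.cos(wd*t_test)
         + zeta*omega/wd * np.sin(wd*t_test))
x_pinn = basis(np.array([t_test])) @ c
print(abs(x_pinn[0] - x_true))   # small error: the physics loss shaped the fit
```

Only two "data" equations are supplied; the physics residual rows do the rest, which is why PINNs cope well with sparse data.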

D. Hybrid Modeling with Differential Equations or Simulators

Hybrid modeling—also known as gray-box modeling—combines known physics with data-driven ML components. This can take the form of serial, parallel, or component-replacement strategies.

For instance, one can replace an unknown reaction rate term in a chemical process model with a neural network learned from data, and then embed this hybrid model into an ODE solver.
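A sketch of this strategy under illustrative assumptions: the CSTR mass balance dC/dt = (q/V)(C_in − C) − r(C) is known, but the rate term r(C) is not. Below, a polynomial fitted to rate measurements stands in for the neural network, and the learned term is embedded in the first-principles ODE.

```python
import numpy as np

# Hybrid CSTR sketch: known mass balance + learned reaction-rate term.
# All parameters are illustrative; a neural network would normally play
# the role of the fitted polynomial below.
q_over_V, C_in = 0.5, 2.0                      # dilution rate (1/s), feed conc.
r_true = lambda C: 0.8 * C**2                  # "unknown" kinetics (ground truth)

# Step 1: fit the rate term from (C, r) measurements
C_data = np.linspace(0.0, 2.0, 20)
r_data = r_true(C_data)                        # noise-free for clarity
X = np.vstack([C_data**2, C_data, np.ones_like(C_data)]).T
coef = np.linalg.lstsq(X, r_data, rcond=None)[0]
r_learned = lambda C: coef[0]*C**2 + coef[1]*C + coef[2]

# Step 2: embed the learned term in the first-principles ODE and integrate
def simulate(rate, C0=0.0, dt=0.01, steps=2000):
    C = C0
    for _ in range(steps):
        C += dt * (q_over_V * (C_in - C) - rate(C))   # explicit Euler
    return C

print(simulate(r_true), simulate(r_learned))   # hybrid matches full physics
```

The known mass-balance structure is never learned, only the uncertain kinetic term, which is what makes gray-box models data-efficient and interpretable.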

E. Data Assimilation and Regularization with Physical Priors

Data assimilation techniques—such as Kalman filters or Moving Horizon Estimation (MHE)—continuously update the model state or parameters by blending model predictions with measurements. Regularization with physical priors (e.g., smoothness, monotonicity) further guides the learning towards physically plausible solutions.
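A scalar Kalman filter for a first-order thermal system illustrates the blend of model prediction and measurement (all parameter and noise values are illustrative):

```python
import numpy as np

# Scalar Kalman filter for a first-order cooling model (illustrative values):
# T[k+1] = T[k] + dt * a * (T_amb - T[k]) + process noise
a, T_amb, dt = 0.1, 20.0, 1.0
F = 1.0 - dt * a                 # discrete-time transition
Q, R = 0.01, 4.0                 # process / measurement noise variances

rng = np.random.default_rng(1)
T_true, T_est, P = 90.0, 80.0, 25.0   # poor initial guess, large covariance
err_meas, err_filt = [], []
for _ in range(200):
    # truth evolves; a noisy measurement arrives
    T_true = F * T_true + dt * a * T_amb + rng.normal(0, np.sqrt(Q))
    z = T_true + rng.normal(0, np.sqrt(R))
    # predict with the physical model, then correct with the measurement
    T_pred = F * T_est + dt * a * T_amb
    P_pred = F * P * F + Q
    K = P_pred / (P_pred + R)    # Kalman gain
    T_est = T_pred + K * (z - T_pred)
    P = (1 - K) * P_pred
    err_meas.append(abs(z - T_true))
    err_filt.append(abs(T_est - T_true))

print(np.mean(err_filt) < np.mean(err_meas))   # filter beats raw measurements
```

The physical model supplies the prediction step, so the filtered estimate is both less noisy than the raw measurements and consistent with the assumed dynamics; MHE generalizes the same idea to a constrained optimization over a moving window.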

II. Comparison of PIML Techniques

  • Feature Engineering
    Strengths: Simple to implement; leverages domain insight; improves interpretability.
    Limitations: Requires accurate a priori identification of key physical features; may not fully capture complex dynamics.
  • Custom Neural Architectures
    Strengths: Guarantees adherence to physical laws (e.g., conservation) by design; yields physically interpretable representations.
    Limitations: Requires specialized knowledge for design; less flexible if the system deviates from the assumed physics.
  • Physics-Based Loss (PINNs)
    Strengths: General and flexible; learns solutions that satisfy PDE/ODE constraints even with limited data; naturally addresses inverse problems.
    Limitations: Training can be slow and challenging; balancing physics and data loss requires careful tuning.
  • Hybrid Modeling
    Strengths: Combines the strengths of first principles and ML; improved accuracy with limited data; known physics components remain interpretable.
    Limitations: Integration can be complex; interfacing ML with simulators requires careful scaling and calibration.
  • Data Assimilation & Regularization
    Strengths: Adaptive correction with real-time data; enhances robustness; gently guides the solution towards physical plausibility.
    Limitations: Additional computational overhead; tuning of assimilation parameters is often problem-dependent.

III. Conclusion

Physics-informed machine learning represents a significant advancement in bridging the gap between black-box AI and traditional engineering practices. By embedding physical laws into the ML modeling process—whether through feature engineering, custom neural architectures, physics-based loss functions, hybrid modeling, or data assimilation—engineers obtain models that are more data-efficient, generalizable, and interpretable. This synthesis of physics and data enables more reliable dynamic optimization and control strategies, such as model predictive control of chemical processes.

Looking ahead, active research is extending data-driven methods from estimating unknown parameters to discovering entire physical laws, further enhancing the robustness and interpretability of PIML in complex engineering applications.

Case Study

Physics-Informed Learning of Thermophysical Properties

IV. References

  • Karniadakis, G. E., Kevrekidis, I. G., et al. (2021). Physics-informed machine learning. Nat. Rev. Phys., 3(6), 422–440.
  • Willard, J., Jia, X., et al. (2022). Integrating Scientific Knowledge with Machine Learning for Engineering and Environmental Systems. ACM Comput. Surv., 55(4), 73.
  • Raissi, M., Perdikaris, P., & Karniadakis, G. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear PDEs. J. Comput. Phys., 378, 686–707.
  • Greydanus, S., Dzamba, M., & Yosinski, J. (2019). Hamiltonian Neural Networks. NeurIPS 2019.
  • Karpatne, A., Watkins, W., Read, J., & Kumar, V. (2017). Physics-guided Neural Networks (PGNN): An Application in Lake Temperature Modeling. SIAM Int. Conf. on Data Mining (SDM).
  • Parish, E. & Duraisamy, K. (2016). A paradigm for data-driven predictive modeling using field inversion and machine learning. J. Comput. Phys., 305, 758–774.
  • GEKKO Documentation (2020). Machine Learning in GEKKO.
  • Sanyal, S. & Roy, K. (2023). RAMP-Net: Robust Adaptive MPC for Quadrotors via Physics-informed Neural Networks. IEEE Int. Conf. Robotics and Automation (ICRA).
  • Zheng, Y. & Wu, W. (2023). Physics-informed recurrent neural network based MPC for chemical processes. J. Process Control, 118, 65–77.
  • Patel, D. et al. (2024). Model Predictive Control Using Physics Informed Neural Networks for Process Systems. ADCHEM 2024, IFAC-PapersOnLine, 57(9), 276–281.
  • Gunnell, L., Lu, X., Vienna, J.D., Kim, D-S, Riley, B.J., & Hedengren, J.D. (2025). Uncertainty propagation and sensitivity analysis for constrained optimization of nuclear waste vitrification. doi: 10.1111/jace.20446.
  • Arce Munoz, S., & Hedengren, J.D. (2025). Transfer Learning for Thickener Control. Processes, 13, 223. doi: 10.3390/pr13010223.
  • Arce Munoz, S., Pershing, J., & Hedengren, J.D. (2024). Physics-Informed Transfer Learning for Process Control Applications. Industrial & Engineering Chemistry Research. doi: 10.1021/acs.iecr.4c02781.
  • Gunnell, L., Nicholson, B., & Hedengren, J.D. (2024). Equation-based and data-driven modeling: Open-source software current state and future directions. Computers & Chemical Engineering, 108521. doi: 10.1016/j.compchemeng.2023.108521.
  • Park, J., Babaei, M.R., Arce Munoz, S., Venkat, A.N., & Hedengren, J.D. (2023). Simultaneous Multistep Transformer Architecture for Model Predictive Control. Computers & Chemical Engineering, 178, 108396. doi: 10.1016/j.compchemeng.2023.108396.
  • Babaei, M.R., Stone, R., Knotts, T.A., & Hedengren, J.D. (2023). Physics-Informed Neural Networks with Group Contribution Methods. Journal of Chemical Theory and Computation. doi: 10.1021/acs.jctc.3c00195.