Machine learning meets physical modeling

Hybrid data-driven and physics-informed models combine the strengths of machine learning with traditional physical simulations to enhance accuracy, speed, and interpretability.

Purely data-driven models often struggle to generalize beyond their training data—especially in complex systems where observations are sparse or noisy. Physics-based models, grounded in fundamental laws, offer strong generalization but can be computationally expensive or rely on simplifications. Hybrid modeling techniques combine these approaches to produce models that are both accurate and efficient. This is particularly valuable for subsurface systems, where direct measurements are limited and uncertainty is high.

What do we do?

We develop hybrid modeling methods that integrate physical insight with machine learning to create fast, accurate flow models for complex systems. Our primary focus is on subsurface applications—including hydrocarbon recovery, CO₂ storage, geothermal energy, and gas storage—where uncertainty is high and data availability is limited.

🧠 Treating the simulator as a neural network

In our CGNet approach, we use a fully differentiable flow simulator as a trainable model—conceptually analogous to a neural network. The model architecture is defined by the computational graph induced by a finite-volume discretization over a coarse grid (or a coarse partition of a more accurate grid), where physical properties such as pore volumes and transmissibilities act as tunable weights. These are calibrated using gradient-based optimization, with automatic differentiation and adjoint methods providing the equivalent of backpropagation.

CGNet can be trained to match observed field data or to emulate high-fidelity simulation output—offering rapid, physics-consistent predictions at reduced computational cost.
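
The sketch below illustrates the idea on a deliberately small example: a one-dimensional chain of coarse cells in which face transmissibilities and pore volumes are the trainable weights, and the mismatch against synthetic reference pressures is minimized by backpropagating through the simulator. The grid size, source terms, and reference data are assumptions made for illustration; this is not the actual MRST-based CGNet implementation.

```python
import torch

n_cells, n_steps, dt = 5, 30, 0.1
# Trainable "network weights": log-parameters keep the physical values positive.
log_trans = torch.zeros(n_cells - 1, requires_grad=True)  # face transmissibilities
log_pv    = torch.zeros(n_cells, requires_grad=True)      # cell pore volumes

def simulate(log_trans, log_pv):
    """Explicit pressure update on a 1D chain of coarse cells (toy finite volumes)."""
    T, pv = log_trans.exp(), log_pv.exp()
    q = torch.zeros(n_cells)
    q[0], q[-1] = 1.0, -1.0                                # injector and producer
    p, history = torch.zeros(n_cells), []
    for _ in range(n_steps):
        flux = T * (p[:-1] - p[1:])                        # inter-cell fluxes
        div = torch.cat([-flux, flux.new_zeros(1)]) + torch.cat([flux.new_zeros(1), flux])
        p = p + dt / pv * (div + q)                        # accumulation update
        history.append(p)
    return torch.stack(history)

# Synthetic "observations" from a reference parameter set (stand-in for field data).
p_obs = simulate(torch.log(torch.full((n_cells - 1,), 0.5)), torch.zeros(n_cells)).detach()

opt = torch.optim.Adam([log_trans, log_pv], lr=0.05)
for it in range(200):
    opt.zero_grad()
    loss = torch.mean((simulate(log_trans, log_pv) - p_obs) ** 2)
    loss.backward()                                        # AD/adjoints play the role of backpropagation
    opt.step()
print(f"calibrated transmissibilities: {log_trans.exp().detach().numpy()}")
```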

  Illustration of a CGNet for the Norne oil and gas field
  • K.-A. Lie and S. Krogstad. Data-driven modelling with coarse-grid network models. Computational Geosciences, 2023. DOI: 10.1007/s10596-023-10237-y
  • K.-A. Lie and S. Krogstad. Comparison of two different types of reduced graph-based reservoir models: Interwell networks (GPSNet) versus aggregated coarse-grid networks (CGNet). Geoenergy Science and Engineering, 221, Feb. 2023, 111266. DOI: 10.1016/j.petrol.2022.111266
  • B. Aslam, B. Yan, K.-A. Lie, S. Krogstad, O. Møyner, and X. He. A novel hybrid physics/data-driven model for fractured reservoir simulation. SPE Journal, 2024. DOI: 10.2118/219110-PA

🔁 Other hybrid flow models

CGNet is one of several hybrid approaches we develop. We also work with numerical interwell network models like GPSNet, FlowNet, and StellNet, which use simplified flow networks trained on field or simulation data to predict reservoir behavior quickly.

For unconventional resources such as shale oil, we use calibrated 1D models to predict production from hydraulically fractured, ultra-low-permeability systems, capturing fracture–matrix interactions and transient flow dynamics.

  Illustration of a GPSNet for the Norne oil and gas field
  • S. Krogstad, M. A. Jakymec, A. Kianinejad, D. Pertuso, S. Matringe, A. Brostrom, J. Torben, O. Møyner, K.-A. Lie. Reduced physics-based simulation for unconventional production forecasting – A 1D approach. The Unconventional Resources Technology Conference (URTeC), Houston, June 9–11, 2025. URTeC: 4253913.
  • M. A. Borregales Reverón, H. H. Holm, O. Møyner, S. Krogstad, and K.-A. Lie. Numerical comparison between ES-MDA and gradient-based optimization for history matching of reduced reservoir models. SPE Reservoir Simulation Conference, Galveston, Texas, USA, 3–5 October, 2021. DOI: 10.2118/203975-MS
  • M. Borregales, O. Møyner, S. Krogstad, K.-A. Lie. Data-driven models based on flow diagnostics. ECMOR XVII - 17th European Conference on the Mathematics of Oil Recovery, 2020. DOI: 10.3997/2214-4609.202035122

📚 Standard machine learning approaches

We have experience applying established techniques such as Physics-Informed Neural Networks (PINNs), Fourier Neural Operators (FNOs), Pseudo-Hamiltonian Neural Networks (PHNNs), and other physics-aware architectures. These methods can be valuable when parts of the physics are unknown or when full-scale simulation is too computationally expensive.
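
As a concrete illustration of the first of these, a minimal PINN can be sketched in a few lines: a small network is trained so that its second derivative satisfies a simple Poisson equation while the boundary conditions are enforced through the loss. The equation, network size, and training setup below are chosen only for illustration.

```python
import torch, math

# Learn u(x) with u''(x) = -pi^2 sin(pi x) on (0, 1), u(0) = u(1) = 0; exact solution u = sin(pi x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 1.0, 64).reshape(-1, 1)
x.requires_grad_(True)                                      # needed for the PDE residual
x_bc = torch.tensor([[0.0], [1.0]])                         # boundary points

for it in range(2000):
    opt.zero_grad()
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = d2u + math.pi**2 * torch.sin(math.pi * x)    # PDE residual
    loss = (residual**2).mean() + (net(x_bc)**2).mean()     # physics term + boundary term
    loss.backward()
    opt.step()
```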

In addition, we have developed novel and fast surrogate models for estimating observables from simulators by combining machine learning with state-of-the-art uncertainty quantification techniques. These surrogate models enable efficient gradient-based optimization even on black-box simulators. By utilizing multi-fidelity data for training, we achieve better accuracy for the same computational cost.
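
The sketch below conveys the multi-fidelity idea on a toy problem; the two "simulators", network sizes, and sample counts are assumptions for illustration only. A network is fitted to many cheap low-fidelity evaluations, a second network learns the correction from a handful of expensive high-fidelity evaluations, and the resulting differentiable surrogate can then drive gradient-based optimization of what is otherwise a black-box quantity.

```python
import torch

def lofi(x):  return torch.sin(3 * x)                      # cheap, approximate "simulator" (assumed)
def hifi(x):  return torch.sin(3 * x) + 0.3 * x**2         # expensive, accurate "simulator" (assumed)

def mlp():
    return torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def fit(net, x, y, steps=2000):
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((net(x) - y)**2).mean()
        loss.backward()
        opt.step()

coarse_net, corr_net = mlp(), mlp()
x_lo = torch.rand(200, 1)                                   # many cheap samples
x_hi = torch.rand(10, 1)                                    # few expensive samples

fit(coarse_net, x_lo, lofi(x_lo))                           # level 1: low-fidelity trend
fit(corr_net, x_hi, hifi(x_hi) - coarse_net(x_hi).detach()) # level 2: high-fidelity correction

surrogate = lambda x: coarse_net(x) + corr_net(x)           # differentiable stand-in for the simulator

# Gradient-based optimization of a design variable on the surrogate.
x_opt = torch.tensor([[0.5]], requires_grad=True)
opt = torch.optim.Adam([x_opt], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    surrogate(x_opt).sum().backward()                       # minimize the surrogate prediction
    opt.step()
```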

  Mach number contours around an airfoil before (left) and after (right) shape optimization with our active learning algorithm.
  • K. O. Lye, M. V. Tabib, and K. A. Johannessen. A reinforcement learning framework for wake steering of wind turbines. Journal of Physics: Conference Series, 2626, 2023. DOI: 10.1088/1742-6596/2626/1/012051
  • S. Eidnes and K. O. Lye. Pseudo-Hamiltonian neural networks for learning partial differential equations. Journal of Computational Physics, 500, Mar. 2024, 112738. DOI: 10.1016/j.jcp.2023.112738
  • K. O. Lye, S. Mishra, D. Ray, and P. Chandrashekar. Iterative surrogate model optimization (ISMO): An active learning algorithm for PDE constrained optimization with deep neural networks. Computer Methods in Applied Mechanics and Engineering, 374, 2021. DOI: 10.1016/j.cma.2020.113575
  • K. O. Lye, S. Mishra, and R. Molinaro. A multi-level procedure for enhancing accuracy of machine learning algorithms. European Journal of Applied Mathematics, 32(3), 2021, 436–469. DOI: 10.1017/S0956792520000224
  • K. O. Lye, S. Mishra, and D. Ray. Deep learning observables in computational fluid dynamics. Journal of Computational Physics, 410, 2020, 109339. DOI: 10.1016/j.jcp.2020.109339

🧩 System identification and inverse modeling

We explore how to learn governing equations directly from data, including system identification for non-linear PDEs. In preliminary tests, we enforce known physical constraints by adjusting the structural assumptions placed on the learned Hamiltonian, yielding interpretable and physically plausible models.
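
A minimal example of the kind of constraint we have in mind is sketched below (an illustration in the spirit of Hamiltonian/PHNN models, not the published implementation): a network H(q, p) is fitted so that observed time derivatives match the symplectic form (dq/dt, dp/dt) = (∂H/∂p, -∂H/∂q), which builds energy conservation into the learned dynamics. The harmonic-oscillator data and network size are assumptions for illustration.

```python
import torch

H = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(H.parameters(), lr=1e-3)

# Synthetic training data from a harmonic oscillator, whose true Hamiltonian is (q^2 + p^2)/2.
z = torch.randn(256, 2)                                     # states (q, p)
dz_true = torch.stack([z[:, 1], -z[:, 0]], dim=1)           # (dq/dt, dp/dt)

for it in range(3000):
    opt.zero_grad()
    z_in = z.clone().requires_grad_(True)
    grads = torch.autograd.grad(H(z_in).sum(), z_in, create_graph=True)[0]
    dz_pred = torch.stack([grads[:, 1], -grads[:, 0]], dim=1)  # symplectic structure enforced
    loss = ((dz_pred - dz_true)**2).mean()
    loss.backward()
    opt.step()
```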

We are also developing methods to embed neural networks implicitly into simulators, enabling solutions to inverse problems by combining differentiable simulators, functional approximations (e.g., neural networks), and a priori constraints.
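
A hedged sketch of that idea is shown below; the heat-equation setup, grid, and unknown coefficient are toy assumptions. A small network represents a spatially varying diffusivity κ(x) inside a differentiable finite-difference solver; because gradients flow through the solver, the coefficient can be recovered from observed solutions, with positivity enforced a priori through a softplus output layer.

```python
import torch

n, dt, steps = 50, 5e-5, 200
x = torch.linspace(0.0, 1.0, n)
x_face = 0.5 * (x[:-1] + x[1:]).reshape(-1, 1)              # face midpoints
kappa_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                                torch.nn.Linear(32, 1), torch.nn.Softplus())

def solve(kappa_face):
    """Explicit finite differences for u_t = (kappa u_x)_x with u = 0 at both ends."""
    u = torch.sin(torch.pi * x)
    h = 1.0 / (n - 1)
    for _ in range(steps):
        flux = kappa_face * (u[1:] - u[:-1]) / h            # kappa du/dx on interior faces
        interior = (flux[1:] - flux[:-1]) / h
        u = torch.cat([u[:1] * 0, u[1:-1] + dt * interior, u[-1:] * 0])
    return u

# Synthetic observations from a known coefficient (stand-in for measured data).
kappa_true = 1.0 + 0.5 * torch.sin(2 * torch.pi * x_face.squeeze())
u_obs = solve(kappa_true).detach()

opt = torch.optim.Adam(kappa_net.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = ((solve(kappa_net(x_face).squeeze()) - u_obs) ** 2).mean()
    loss.backward()                                         # gradients flow through the whole solver
    opt.step()
```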

  Use of a PHNN for a Cahn-Hilliard problem
  • S. Eidnes and K. O. Lye. Pseudo-Hamiltonian neural networks for learning partial differential equations. Journal of Computational Physics, 500, Mar. 2024, 112738. DOI: 10.1016/j.jcp.2023.112738

🔄 Neural networks as correctors

We are currently exploring predictor–corrector hybrid frameworks, where a calibrated simulator or a physics-based model like CGNet serves as the predictor and a neural network acts as the corrector. The predictor enforces core physical principles, while the corrector improves accuracy by learning from mismatches with observed data. This approach preserves physical consistency while improving prediction fidelity.
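
The toy sketch below shows the structure of such a scheme; the decline-curve "predictor" and the synthetic observations are placeholders, not a trained CGNet. The physics-based predictor is kept fixed, a small network is trained on the residual against observed data, and the hybrid prediction is the sum of the two.

```python
import torch

t = torch.linspace(0.0, 1.0, 100).reshape(-1, 1)
predictor = lambda t: torch.exp(-2.0 * t)                   # simplified physics model (assumed)
observed  = torch.exp(-2.0 * t) * (1.0 + 0.1 * torch.sin(8 * t))  # synthetic "field data"

corrector = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(corrector.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = ((predictor(t) + corrector(t) - observed) ** 2).mean()  # learn the mismatch only
    loss.backward()
    opt.step()

hybrid = lambda t: predictor(t) + corrector(t)              # physics baseline + learned correction
```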

  Use of a neural network to correct predictions by a trained CGNet
 

🌍 Supporting physics-informed digital twins

Our hybrid models form a foundation for building digital twins of subsurface systems—dynamic, real-time simulations that integrate operational data and sensor inputs. These digital twins are designed to improve understanding of complex processes and support more effective decision-making throughout the asset lifecycle. By combining fast surrogate models with continuous data updates, they enable forecasting, monitoring, and optimization under uncertainty.

Much of our work in this area has focused on geothermal energy systems, where real-time modeling and data integration are essential for efficient resource management, performance prediction, and operational planning.

  A digital twin of a geothermal plant