Abstract
Deep operator networks (DeepONets, DONs) offer a distinct advantage over traditional neural networks: the ability to learn from data collected at heterogeneous resolutions. This property, known as discretization invariance, can be particularly valuable when modeling fine-scale temporal dynamics with limited high-resolution data, provided that lower-resolution data are available. Nevertheless, compared with other state-of-the-art algorithms, DeepONets alone often struggle to capture and maintain dependencies over long sequences.
We propose a novel framework that leverages multi-resolution data during training and yields accurate models even when high-resolution data are limited. We achieve this by extending the DeepONet architecture with a long short-term memory network (LSTM) and training it in a three-step procedure that utilizes data of different levels of granularity. By combining these two architectures, we equip the network with explicit mechanisms to leverage multi-resolution data and to capture temporal dependencies in long sequences. We test our method on long-time-evolution modeling of multiple non-linear systems and show that the proposed multi-resolution DON-LSTM achieves significantly lower generalization error and requires fewer high-resolution samples than its vanilla counterparts.