A pseudo-reversible normalizing flow for stochastic differential equations and uncertainty quantification
We present a pseudo-reversible normalizing flow method for efficiently generating samples of the state of a stochastic differential equation (SDE) with various initial distributions. Once trained, the normalizing flow model can directly generate samples of the SDE's final state without simulating trajectories. Notably, the model requires only a single training session, after which it can accommodate a range of initial distributions; this feature provides significant computational savings in studies of how the final state varies with the initial distribution. Additionally, we introduce a conditional pseudo-reversible normalizing flow for quantifying forward and inverse uncertainty propagation. We provide a convergence analysis in the Kullback–Leibler divergence together with numerical experiments.
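For context, the conventional approach the trained flow bypasses is direct trajectory simulation. The sketch below (a hypothetical illustration, not the paper's method) uses Euler–Maruyama time-stepping on an assumed Ornstein–Uhlenbeck SDE to draw final-state samples from a given initial distribution; a trained normalizing flow would instead map initial samples to final-state samples in a single pass.

```python
import numpy as np

def sample_sde_final_state(x0, theta=1.0, sigma=0.5, T=1.0, n_steps=200, seed=None):
    """Euler--Maruyama simulation of the OU SDE dX = -theta*X dt + sigma dW.

    Given samples x0 of the initial distribution, returns samples of X(T).
    This is the trajectory-based baseline; a trained pseudo-reversible
    normalizing flow would produce X(T) samples directly, with no
    time-stepping loop, and a single trained model could be reused
    across different initial distributions.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        # One explicit Euler--Maruyama step: drift term plus Brownian increment.
        x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Example initial distribution (parameters chosen for illustration only).
rng = np.random.default_rng(0)
x0 = rng.normal(loc=2.0, scale=0.1, size=100_000)
xT = sample_sde_final_state(x0, seed=1)
# For the OU process, E[X(T)] = E[X(0)] * exp(-theta * T), so the sample
# mean of xT should be close to 2.0 * exp(-1.0).
print(xT.mean())
```

Studying how X(T) changes with the initial distribution requires rerunning this whole simulation for every new choice of `x0`, which is the cost the single-training-session property of the proposed flow avoids.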
Last modified: Wed Mar 27 14:13:04 EDT 2024