I’m very happy to report that our paper on autoregressive predictions with generative diffusion models has finally been accepted 😁 Congratulations Georg!

It’s been a long journey: this paper was first submitted to NeurIPS’23, and now, almost three years and several further submission rounds later, it has finally made it into “Neural Networks” https://www.sciencedirect.com/science/article/pii/S0893608026001036

Despite the long wait, I think the main conclusions of our paper are still valid:

  • even vanilla diffusion / flow matching models, without special modifications, perform very well for temporal predictions
  • deterministic models with proper unrolling can match them
    Importantly, both can yield “unconditionally stable” surrogates, i.e., NN operators that don’t blow up over long rollouts
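For anyone unfamiliar with the setup: “autoregressive” here just means the model predicts the next state from the current one and is fed its own outputs in a loop. A minimal sketch of such a rollout, with a hypothetical contractive linear map standing in for the trained surrogate (the actual models are in the repo linked below):

```python
import numpy as np

def rollout(model, state, num_steps):
    """Autoregressively unroll a surrogate: each prediction
    becomes the input for the next step."""
    trajectory = [state]
    for _ in range(num_steps):
        state = model(state)  # feed the model's own output back in
        trajectory.append(state)
    return np.stack(trajectory)

# Stand-in surrogate (NOT the paper's model): a linear map scaled to
# spectral radius 0.9, so the rollout stays bounded -- this is the
# "unconditionally stable" behavior a good surrogate should exhibit.
rng = np.random.default_rng(0)
W = 0.9 * np.linalg.qr(rng.normal(size=(16, 16)))[0]
surrogate = lambda x: W @ x

traj = rollout(surrogate, rng.normal(size=16), num_steps=100)
print(traj.shape)  # one initial state + 100 predicted states
```

An unstable surrogate would be the opposite case: amplify errors slightly at each step and the trajectory diverges exponentially, which is exactly the failure mode long unrolled training is meant to avoid.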

Code & benchmark can be found here: https://github.com/tum-pbs/autoreg-pde-diffusion