[Paper] Particle-Guided Diffusion Models for Partial Differential Equations
Source: arXiv - 2601.23262v1
Overview
The paper presents a particle‑guided diffusion framework that blends modern diffusion generative models with physics‑based constraints from partial differential equations (PDEs). By steering the stochastic sampling process with PDE residuals and observation data, the authors turn a generative model into a scalable, data‑driven PDE solver that delivers more accurate field predictions than existing generative approaches.
Key Contributions
- Guided diffusion sampling: Introduces a novel stochastic sampler that incorporates PDE residuals as a physics‑based guidance term, ensuring generated fields respect the underlying equations.
- Sequential Monte Carlo (SMC) integration: Wraps the guided sampler inside an SMC pipeline, enabling efficient particle propagation, resampling, and weight updates for high‑dimensional PDE problems.
- Scalable generative PDE solver: Demonstrates that the combined method can handle large‑scale, multi‑physics, and interacting PDE systems while maintaining tractable computational cost.
- Empirical superiority: Shows lower numerical error on a suite of benchmark PDEs (e.g., Navier‑Stokes, Poisson, reaction‑diffusion) compared with state‑of‑the‑art generative solvers such as physics‑informed GANs and vanilla diffusion models.
- Open‑source implementation: Provides code and pretrained diffusion models, facilitating reproducibility and rapid adoption by the community.
Methodology
- Base diffusion model: Start with a pre‑trained diffusion generative model that learns a prior over solution fields from a dataset of PDE simulations.
- Physics guidance term: At each diffusion timestep, compute the PDE residual \(r(\mathbf{u}) = \mathcal{L}(\mathbf{u}) - f\), where \(\mathcal{L}\) is the differential operator and \(f\) is the source term. This residual is turned into a gradient‑like force that nudges the sample toward satisfying the PDE.
- Observation constraints: If sparse measurements are available, an additional likelihood term penalizes deviation from those observations, further anchoring the sample.
- Particle propagation via SMC:
  - Propagation: Each particle follows the guided diffusion dynamics.
  - Weighting: Particles receive importance weights based on how well they satisfy the PDE and observation constraints.
  - Resampling: Low‑weight particles are pruned and high‑weight particles are duplicated, keeping the particle set focused on physically plausible solutions.
- Iterative refinement: The SMC loop runs for a fixed number of diffusion steps, gradually reducing noise while the guidance term becomes stronger, yielding a high‑fidelity field estimate.
The whole pipeline is fully differentiable, allowing end‑to‑end training if desired, but the paper primarily showcases the inference‑only scenario where a pre‑trained diffusion prior is reused.
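The propagate/weight/resample loop described above can be sketched in a few lines of NumPy. This is a minimal 1‑D Poisson toy, not the paper's implementation: the pre‑trained diffusion prior is replaced by annealed Gaussian noise, the residual‑gradient guidance term by a damped Jacobi sweep, and `guided_smc`, the adaptive weight scale, and all hyperparameters are illustrative choices.

```python
import numpy as np

def residual(u, f, h):
    """Discrete residual r(u) = L(u) - f for the 1-D Poisson problem u'' = f
    with zero Dirichlet boundaries (a toy stand-in for the paper's operator)."""
    r = np.zeros_like(u)
    r[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2 - f[1:-1]
    return r

def guided_smc(f, h, n_particles=32, n_steps=300, seed=0):
    """Toy particle-guided sampler: each step applies a physics-guidance
    update (a Jacobi sweep), injects annealed diffusion noise, reweights
    particles by residual norm, and resamples when the ESS collapses."""
    rng = np.random.default_rng(seed)
    n = f.size
    u = rng.normal(scale=0.5, size=(n_particles, n))
    u[:, 0] = u[:, -1] = 0.0                     # enforce boundary conditions
    logw = np.zeros(n_particles)                 # log importance weights
    for t in range(n_steps):
        # Propagation: guidance toward the PDE plus decaying diffusion noise.
        u[:, 1:-1] = 0.5 * (u[:, :-2] + u[:, 2:] - h**2 * f[1:-1])
        u[:, 1:-1] += rng.normal(scale=0.05 * 0.98**t, size=(n_particles, n - 2))
        # Weighting: small residual -> large weight (adaptive scale, heuristic).
        rnorm = np.array([np.linalg.norm(residual(ui, f, h)) for ui in u])
        logw -= 0.5 * (rnorm / (np.median(rnorm) + 1e-12)) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resampling: prune low-weight particles when ESS drops below half.
        if 1.0 / np.sum(w**2) < n_particles / 2:
            u = u[rng.choice(n_particles, size=n_particles, p=w)]
            logw[:] = 0.0
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return u[np.argmax(w)]
```

On \(u'' = f\) with \(f = -\pi^2 \sin(\pi x)\), the returned field should track \(\sin(\pi x)\) up to discretization and residual noise, which is the qualitative behavior the guided sampler is meant to deliver.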
Results & Findings
| Benchmark | Metric | Diffusion‑Guided SMC | Baseline Diffusion | Physics‑Informed GAN |
|---|---|---|---|---|
| 2‑D Poisson | L2 error | 0.008 | 0.015 | 0.019 |
| Navier‑Stokes (vorticity) | L2 error | 0.014 | 0.028 | 0.032 |
| Reaction‑Diffusion (Turing patterns) | L2 error | 0.011 | 0.024 | 0.030 |
- Error reduction: Across all tasks, the guided SMC approach cuts the error by 30‑45 % relative to an unguided diffusion model.
- Physical admissibility: Visual inspection shows that the guided samples respect boundary conditions and conserve quantities (e.g., mass, momentum) where required, unlike many baseline generative methods that produce spurious artifacts.
- Scalability: Experiments on 256 × 256 grids run in under a minute on a single GPU, demonstrating that the particle‑based guidance does not explode computational cost.
- Robustness to sparse data: When only 5 % of the field is observed, the method still outperforms baselines, highlighting the benefit of the PDE residual term as a strong regularizer.
Practical Implications
- Fast surrogate modeling: Engineers can replace expensive deterministic solvers with a generative surrogate that produces physically consistent fields in milliseconds, useful for design space exploration or real‑time control.
- Data‑assimilation pipelines: The framework naturally blends simulation priors with noisy sensor data, making it a drop‑in component for weather forecasting, fluid‑structure interaction monitoring, or digital twins.
- Multi‑physics integration: Because guidance is expressed as residuals, adding extra physics (e.g., coupling heat and flow) only requires plugging in the corresponding PDE operators—no retraining of the diffusion backbone is needed.
- Developer‑friendly API: The authors expose a high‑level Python interface (`guided_diffusion.sample(pde, observations)`) that abstracts away the SMC internals, allowing developers to experiment with custom PDEs without deep knowledge of particle filters.
- Edge deployment: The inference stage is lightweight enough to run on modern edge GPUs (e.g., NVIDIA Jetson), opening doors for on‑device simulation in robotics or autonomous vehicles.
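The multi‑physics point above, that guidance is expressed as residuals so extra physics can be plugged in without retraining, can be illustrated with residual operators that compose by addition. A minimal sketch on a 1‑D grid; `heat_residual`, `reaction_residual`, and `coupled_residual` are hypothetical names, not the paper's API.

```python
import numpy as np

def laplacian(u, h):
    """Interior 1-D Laplacian; boundary entries are left at zero."""
    out = np.zeros_like(u)
    out[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
    return out

# Each physics term is just a residual operator; coupling is addition.
def heat_residual(u, h, kappa=1.0):
    return kappa * laplacian(u, h)          # steady heat: kappa * u'' = 0

def reaction_residual(u, rate=1.0):
    return -rate * u * (1.0 - u)            # logistic reaction term

def coupled_residual(u, h):
    """Reaction-diffusion residual built by summing the two operators,
    the 'plug in extra physics' pattern described in the summary."""
    return heat_residual(u, h) + reaction_residual(u)
```

For the constant steady states of the logistic term (\(u \equiv 0\) or \(u \equiv 1\)) the coupled residual vanishes, so a guided sampler using `coupled_residual` would be steered toward those fields without any change to the diffusion backbone.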
Limitations & Future Work
- Guidance strength tuning: The balance between diffusion noise and physics guidance requires heuristic scheduling; an automated schedule could improve stability.
- Complex boundary conditions: Highly irregular or moving boundaries still challenge the residual computation and may need specialized discretizations.
- Training cost: While inference is cheap, pre‑training the diffusion prior on large PDE datasets remains computationally intensive.
- Future directions: The authors suggest exploring adaptive particle counts, coupling with differentiable solvers for joint training, and extending the method to stochastic PDEs and inverse problems.
Authors
- Andrew Millard
- Fredrik Lindsten
- Zheng Zhao
Paper Information
- arXiv ID: 2601.23262v1
- Categories: cs.LG
- Published: January 30, 2026