TL;DR: We introduce Functional Diffusion Processes, a continuous-time diffusion framework for infinite-dimensional function spaces. The framework yields practical reverse-time dynamics and training objectives, and achieves high-quality image generation with simple, low-parameter architectures.
Authors

Giulio Franzese (EURECOM), Giulio Corallo (EURECOM), Simone Rossi (Stellantis), Markus Heinonen (Aalto University), Maurizio Filippone (EURECOM), Pietro Michiardi (EURECOM)

Published

March 1, 2023


Abstract

We introduce Functional Diffusion Processes (FDPs), which generalize score-based diffusion models to infinite-dimensional function spaces. FDPs require a new mathematical framework to describe the forward and backward dynamics, and several extensions to derive practical training objectives. These include infinite-dimensional versions of the Girsanov theorem, needed to compute an ELBO, and of the sampling theorem, which guarantees that function evaluations on a countable set of points are equivalent to the full infinite-dimensional functions.

We use FDPs to build a new breed of generative models in function spaces, which do not require specialized network architectures, and that can work with any kind of continuous data. Our results on real data show that FDPs achieve high-quality image generation, using a simple MLP architecture with orders of magnitude fewer parameters than existing diffusion models.
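The sampling-theorem intuition invoked above can be made concrete with the classical finite-dimensional case: a band-limited function is fully determined by its values on a countable uniform grid, so working with pointwise evaluations loses nothing. The following toy sketch (our own illustration, not code from the paper) reconstructs a band-limited signal from its samples via Whittaker-Shannon (sinc) interpolation; all names are ours.

```python
import numpy as np

def sinc_interp(samples, sample_rate, t):
    """Reconstruct a band-limited signal at times t from uniform samples.

    Uses the Whittaker-Shannon formula: x(t) = sum_n x[n] * sinc(fs*t - n),
    truncated to the finite set of available samples.
    """
    n = np.arange(len(samples))
    return np.sum(samples[None, :] * np.sinc(sample_rate * t[:, None] - n[None, :]),
                  axis=1)

fs = 8.0                                   # samples per unit time
ts = np.arange(0, 4, 1 / fs)               # countable sample grid
f = lambda t: np.sin(2 * np.pi * 1.0 * t)  # 1 Hz < fs/2: band-limited
samples = f(ts)

t_dense = np.linspace(0.5, 3.5, 200)       # stay away from window edges
recon = sinc_interp(samples, fs, t_dense)
print(np.max(np.abs(recon - f(t_dense))))  # small reconstruction error
```

The small residual error comes only from truncating the (in principle infinite) sample grid to a finite window; FDPs rely on the analogous equivalence in the infinite-dimensional setting.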

Key Contributions

  1. Diffusion in function spaces: We extend score-based diffusion modeling from finite-dimensional vectors to infinite-dimensional function spaces, enabling generative modeling directly over continuous objects.

  2. A practical infinite-dimensional training framework: The paper derives reverse-time dynamics and variational training objectives using infinite-dimensional versions of tools such as the Girsanov theorem and the sampling theorem.

  3. Simple architectures remain effective: The framework does not require specialized diffusion backbones and supports practical implementations with lightweight score models.

  4. Strong image-generation results: Experiments show that Functional Diffusion Processes can generate high-quality images with a simple MLP and far fewer parameters than standard diffusion approaches.

Methodology Overview

Samples generated by Functional Diffusion Processes with a simple MLP score model.

The core idea is to formulate diffusion as a stochastic process over functions instead of finite-dimensional data points. This requires redefining both the forward noising dynamics and the reverse denoising dynamics in an infinite-dimensional setting, while preserving a tractable learning objective.
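As a minimal sketch of the forward/reverse structure described above, the snippet below runs a finite-dimensional analogue: a function is represented by its values on a grid, noised by a variance-preserving SDE, and then denoised by the reverse-time SDE. For this toy, the data is a single known function, so the exact score is available in closed form; in the actual method the score is learned. The schedule, grid, and variable names are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
x_grid = np.linspace(0.0, 1.0, 64)   # evaluation points of the function
f0 = np.sin(2 * np.pi * x_grid)      # the "data" function

beta = 1.0                           # constant noise schedule
T, n_steps = 1.0, 1000
dt = T / n_steps

# Forward noising: df = -0.5 * beta * f dt + sqrt(beta) dW
f = f0.copy()
for _ in range(n_steps):
    f += -0.5 * beta * f * dt + np.sqrt(beta * dt) * rng.standard_normal(f.shape)

# With a single data function, the marginal at time t is
# N(a(t) * f0, (1 - a(t)^2) I) with a(t) = exp(-0.5 * beta * t),
# so the exact score is known:
def score(f_t, t):
    a = np.exp(-0.5 * beta * t)
    return -(f_t - a * f0) / (1.0 - a**2)

# Reverse denoising (Euler-Maruyama, integrating from t = T down to 0):
# df = [-0.5*beta*f - beta * score(f, t)] dt + sqrt(beta) dW_bar
g = f.copy()
for i in range(n_steps):
    t = T - i * dt
    drift_rev = -0.5 * beta * g - beta * score(g, t)
    g += -drift_rev * dt + np.sqrt(beta * dt) * rng.standard_normal(g.shape)

print(np.max(np.abs(g - f0)))  # reverse run recovers f0 approximately
```

Replacing the closed-form score with a learned score network, and the fixed grid with arbitrary evaluation points, gives the shape of the FDP pipeline.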

In practice, the framework connects continuous function representations with countable evaluations, which makes training and sampling implementable with standard neural components. This lets the method handle continuous data without resorting to bespoke diffusion architectures.
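One way to picture the "simple MLP" idea is a pointwise score model: the network sees a coordinate, the noised function value at that coordinate, and the time, and predicts the score at that point. Because inputs are per-coordinate, the same network handles any evaluation grid. This is our own hypothetical sketch (weights, sizes, and function names are assumptions, not the paper's architecture).

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 3, 32                         # (x, f_t(x), t) -> hidden
W1 = rng.standard_normal((d_in, d_hid)) * 0.1
b1 = np.zeros(d_hid)
W2 = rng.standard_normal((d_hid, 1)) * 0.1
b2 = np.zeros(1)

def score_mlp(x, f_vals, t):
    """Pointwise score estimate on an arbitrary evaluation grid."""
    inp = np.stack([x, f_vals, np.full_like(x, t)], axis=-1)  # (n, 3)
    h = np.tanh(inp @ W1 + b1)
    return (h @ W2 + b2)[..., 0]                              # (n,)

# The same network on two different grids - no architecture change needed.
coarse = np.linspace(0, 1, 16)
fine = np.linspace(0, 1, 256)
s_coarse = score_mlp(coarse, np.sin(2 * np.pi * coarse), 0.5)
s_fine = score_mlp(fine, np.sin(2 * np.pi * fine), 0.5)
print(s_coarse.shape, s_fine.shape)  # (16,) (256,)
```

The resolution-independence shown here is what lets a lightweight score model stand in for the heavy, fixed-resolution backbones used by standard diffusion models.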

Main Findings

  • Score-based diffusion can be lifted to infinite-dimensional function spaces without losing a practical training-and-sampling pipeline.
  • Infinite-dimensional analysis provides the right tools to derive ELBO-based objectives and reverse-time dynamics for these models.
  • Functional Diffusion Processes work well with simple score networks, including MLPs.
  • On image generation, the method delivers strong sample quality with orders of magnitude fewer parameters than conventional diffusion models.