
BézierFlow: Learning Bézier Stochastic Interpolant Schedulers for Few-Step Generation

Yunhong Min* · Juil Koo* · Seungwoo Yoo · Minhyuk Sung (* Equal Contribution)

KAIST

ICLR 2026

Paper PDF Project Page

BezierFlow Teaser
We introduce BézierFlow, a lightweight training approach for few-step generation with pretrained diffusion and flow models. BézierFlow achieves a 2–3× performance improvement for sampling with ≤ 10 NFEs while requiring only 15 minutes of training.
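The core object is a learned timestep schedule shaped by a cubic Bézier curve. A minimal NumPy sketch of the idea (the function and the parameter names `p1`, `p2` are ours for illustration, not the repo's API):

```python
import numpy as np

def bezier_schedule(s, p1, p2):
    """Cubic Bézier mapping a uniform step fraction s in [0, 1] to a timestep
    t in [0, 1]. The endpoints are pinned to 0 and 1; the two interior control
    values p1, p2 play the role of the learnable schedule parameters."""
    s = np.asarray(s, dtype=np.float64)
    return 3 * (1 - s)**2 * s * p1 + 3 * (1 - s) * s**2 * p2 + s**3

# With p1 = 1/3 and p2 = 2/3 the curve reduces to the identity schedule t = s;
# other control values warp where the few sampling steps are spent.
s = np.linspace(0.0, 1.0, 5)
print(bezier_schedule(s, 1/3, 2/3))  # identity: 0, 0.25, 0.5, 0.75, 1
```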

News

  • [2026.04.13] 🚀 We have released the implementation of BézierFlow: Learning Bézier Stochastic Interpolant Schedulers for Few-Step Generation.
  • [2026.01.27] 🔥 Our work has been accepted to ICLR 2026.

Environment and Requirements

Tested Environment

  • Python: 3.10
  • CUDA: 12.4
  • GPU: Tested on NVIDIA RTX 3090 and RTX A6000

Installation

conda create -n bezierflow python=3.10 -y
conda activate bezierflow
pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu124
pip install -r requirements.txt

Pretrained Checkpoints and FID References

mkdir -p pretrained fid-refs

1) Model checkpoints (pretrained/)

| Model | Source | Filename |
|---|---|---|
| EDM (CIFAR-10) | NVlabs/edm | `edm-cifar10-32x32-uncond-vp.pkl` |
| EDM (FFHQ) | NVlabs/edm | `edm-ffhq-64x64-uncond-vp.pkl` |
| EDM (AFHQv2) | NVlabs/edm | `edm-afhqv2-64x64-uncond-vp.pkl` |
| Rectified Flow | RectifiedFlow | `reflow_1.pth` |
| FlowDCN | MCG-NJU/FlowDCN | `FlowDCN-XL-2M-R256.pth` |

2) FID reference statistics (fid-refs/)

| Dataset | Source | Filename |
|---|---|---|
| CIFAR-10 32×32 | NVlabs/edm | `cifar10-32x32.npz` |
| FFHQ 64×64 | NVlabs/edm | `ffhq-64x64.npz` |
| AFHQv2 64×64 | NVlabs/edm | `afhqv2-64x64.npz` |
| ImageNet 256×256 | openai/guided-diffusion | `VIRTUAL_imagenet256_labeled.npz` |
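The downloads can be scripted; the CDN URLs below are an assumption inferred from the NVlabs/edm release page, so verify them against the linked repositories before use:

```shell
mkdir -p pretrained fid-refs

# Assumed NVlabs CDN paths (check NVlabs/edm for the authoritative links):
# wget -P pretrained https://nvlabs-fi-cdn.nvidia.com/edm/pretrained/edm-cifar10-32x32-uncond-vp.pkl
# wget -P fid-refs   https://nvlabs-fi-cdn.nvidia.com/edm/fid-refs/cifar10-32x32.npz

ls pretrained fid-refs
```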

Usage

Below are representative examples on CIFAR-10 for both a diffusion model (EDM) and a flow model (Rectified Flow). See configs/ for all supported model configurations.

Generate Teacher Data

Generate (latent, image) pairs from the pretrained model using the RK45 solver (other high-accuracy solvers can also be used):

  • EDM (Diffusion)

    python gen_data.py \
        --all_config configs/cifar10_edm.yml \
        --total_samples 400 --sampling_batch_size 10 \
        --steps 1000 --solver_name rk45 --skip_type edm \
        --save_pt --save_png
  • Rectified Flow (Flow)

    python gen_data.py \
        --all_config configs/cifar10_reflow.yml \
        --total_samples 400 --sampling_batch_size 10 \
        --steps 1000 --solver_name rk45 --skip_type rf \
        --save_pt --save_png
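The teacher step can be sketched in isolation: draw a Gaussian latent and integrate a probability-flow ODE from t = 1 down to t = 0 with an adaptive RK45 solver. The velocity field below is a toy stand-in for the pretrained network that gen_data.py actually evaluates:

```python
import numpy as np
from scipy.integrate import solve_ivp

def velocity(t, x):
    # Toy stand-in for the pretrained model's velocity field; under this field
    # the state contracts toward the origin as t decreases from 1 to 0.
    return x

rng = np.random.default_rng(0)
latent = rng.standard_normal(4)                       # z ~ N(0, I) at t = 1
sol = solve_ivp(velocity, (1.0, 0.0), latent,
                method="RK45", rtol=1e-8, atol=1e-8)  # integrate backward in t
sample = sol.y[:, -1]                                 # teacher endpoint at t = 0
# (latent, sample) is one teacher pair; for dx/dt = x the exact map is z / e.
print(np.max(np.abs(sample - latent / np.e)))
```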

Train BézierFlow

Optimize the Bézier noise scheduler for a specific ODE solver and target NFE. Pass the --low_gpu flag to enable gradient checkpointing if you run out of memory.

  • EDM (Diffusion)

    python train.py \
        --all_config configs/cifar10_edm.yml \
        --solver_name uni_pc --steps 10
  • Rectified Flow (Flow)

    python train.py \
        --all_config configs/cifar10_reflow.yml \
        --solver_name midpoint --steps 10
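The optimization can be illustrated with a self-contained toy (a sketch of the idea, not the repo's code): gradient descent on the two interior control values of a cubic Bézier timestep schedule, so that a 4-step Euler student on the toy flow dx/dt = -t·x reproduces a target endpoint. The target here comes from a known schedule, so the loss can be driven to zero:

```python
import numpy as np

def bezier(s, p1, p2):
    # Cubic Bézier with endpoint values pinned to 0 and 1.
    return 3 * (1 - s)**2 * s * p1 + 3 * (1 - s) * s**2 * p2 + s**3

def student(p, z=1.0, steps=4):
    # Few-step Euler student on the toy flow dx/dt = -t * x, with timesteps
    # taken from the Bézier schedule parameterized by p = (p1, p2).
    s = np.linspace(0.0, 1.0, steps + 1)
    t = bezier(s, p[0], p[1])
    x = z
    for i in range(steps):
        x = x + (t[i + 1] - t[i]) * (-t[i] * x)
    return x

target = student(np.array([0.0, 1.0]))    # endpoint under a known schedule
loss = lambda p: (student(p) - target) ** 2

p = np.array([1/3, 2/3])                  # start from the identity schedule
for _ in range(500):                      # finite-difference gradient descent
    grad = np.array([(loss(p + e) - loss(p - e)) / 2e-4
                     for e in (np.array([1e-4, 0.0]), np.array([0.0, 1e-4]))])
    p -= 4.0 * grad
print(loss(p))                            # driven to (near) zero
```

In the real setup the student runs the pretrained model's solver and the targets are the teacher pairs from gen_data.py; only the schedule parameters receive gradients, which is why training stays lightweight.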

Evaluate (FID)

Sample images using the learned schedule and compute FID:

  • EDM (Diffusion)

    python compute_fid.py \
        --all_config configs/cifar10_edm.yml \
        --total_samples 50000 --sampling_batch_size 150 \
        --solver_name uni_pc --steps 10 \
        --load_from all_logs/cifar10_logs/
  • Rectified Flow (Flow)

    python compute_fid.py \
        --all_config configs/cifar10_reflow.yml \
        --total_samples 50000 --sampling_batch_size 150 \
        --solver_name midpoint --steps 10 \
        --load_from all_logs/cifar10_logs/
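For reference, the Fréchet distance underlying FID has a closed form between Gaussians; compute_fid.py compares Inception statistics of the generated samples against the .npz references, and a minimal sketch of the distance itself is:

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    # d^2 = ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 sigma2)^{1/2})
    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):      # small imaginary parts can appear numerically
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

# Identical statistics give distance 0 (up to numerical error).
mu, sigma = np.zeros(3), np.eye(3)
print(frechet_distance(mu, sigma, mu, sigma))
```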

Supported Models and Solvers

| Config | Model | Dataset | Resolution | Supported Student Solvers |
|---|---|---|---|---|
| `cifar10_edm.yml` | EDM | CIFAR-10 | 32×32 | UniPC, iPNDM |
| `ffhq.yml` | EDM | FFHQ | 64×64 | UniPC, iPNDM |
| `afhqv2.yml` | EDM | AFHQv2 | 64×64 | UniPC, iPNDM |
| `cifar10_reflow.yml` | Rectified Flow | CIFAR-10 | 32×32 | Euler (RK1), Midpoint (RK2) |
| `flowdcn_imagenet.yml` | FlowDCN | ImageNet | 256×256 | Euler (RK1), Midpoint (RK2) |
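The two flow-model student solvers differ only in their step rule; a toy comparison on dx/dt = -x (our own sketch, not the repo's solver code):

```python
import numpy as np

def euler_step(f, t, x, h):              # RK1: one velocity evaluation per step
    return x + h * f(t, x)

def midpoint_step(f, t, x, h):           # RK2: evaluate at the step midpoint
    x_mid = x + 0.5 * h * f(t, x)
    return x + h * f(t + 0.5 * h, x_mid)

f = lambda t, x: -x                      # toy flow with exact endpoint exp(-1)
x_e = x_m = 1.0
for k in range(10):                      # 10 uniform steps over [0, 1]
    t, h = k * 0.1, 0.1
    x_e = euler_step(f, t, x_e, h)
    x_m = midpoint_step(f, t, x_m, h)
print(abs(x_e - np.exp(-1)), abs(x_m - np.exp(-1)))  # midpoint error is far smaller
```

At the same NFE budget, midpoint pays two velocity evaluations per step for a much smaller per-step error, which is why the two are offered as alternatives.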

Citation

If you find our work useful, please consider citing our paper:

@inproceedings{min2026bezierflow,
    title={B\'{e}zierFlow: Learning B\'{e}zier Stochastic Interpolant Schedulers for Few-Step Generation},
    author={Min, Yunhong and Koo, Juil and Yoo, Seungwoo and Sung, Minhyuk},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2026}
}

Acknowledgements

This repository builds upon the following projects:

  • LD3 (ICLR 2025, Tong et al.)
  • EDM (NeurIPS 2022, Karras et al.)
  • RectifiedFlow (ICLR 2023, Liu et al.)
  • FlowDCN (NeurIPS 2024, Wang et al.)
  • UniPC (NeurIPS 2023, Zhao et al.)
