Stabilizing Consistency Training: A Flow Map Analysis and Self-Distillation
Youngjoong Kim, Duhoe Kim, Woosung Kim, Jaesik Park
arXiv (Cornell University)
Abstract

Consistency models have been proposed for fast generative modeling, achieving results competitive with diffusion and flow models. However, they exhibit inherent instability and limited reproducibility when trained from scratch, motivating subsequent work to explain and mitigate these issues. While these efforts have provided valuable insights, the explanations remain fragmented and their theoretical relationships unclear. In this work, we examine consistency models theoretically from a flow map-based perspective. This joint analysis clarifies how training stability and convergence behavior can give rise to degenerate solutions. Building on these insights, we revisit self-distillation as a practical remedy for certain forms of suboptimal convergence and reformulate it to avoid excessive gradient norms during optimization. We further demonstrate that our strategy extends beyond image generation to diffusion-based policy learning, without relying on a pretrained diffusion model for initialization, thereby illustrating its broader applicability.
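The self-distillation idea in the abstract can be illustrated with a toy sketch: a student model is matched against an EMA copy of itself at an adjacent point on the same noising trajectory, with the gradient norm capped. Everything here is a hypothetical stand-in, not the paper's actual formulation: the 1-D linear model, the linear noising path, the time pair, and the use of plain gradient clipping as a proxy for the paper's reformulation that avoids excessive gradient norms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy self-distillation on a 1-D linear "network" f(x, t) = w * x.
# All choices below are illustrative, not the paper's method.
w_student = 0.1          # student parameter
w_teacher = w_student    # EMA teacher: a slow-moving copy of the student
ema_decay, lr, max_grad = 0.99, 0.05, 1.0

for step in range(200):
    x0 = rng.normal()                    # clean sample
    eps = rng.normal()                   # noise shared along the trajectory
    t, s = 0.8, 0.4                      # two times on the same trajectory
    xt, xs = x0 + t * eps, x0 + s * eps  # simple linear noising path

    # Self-distillation target: the EMA teacher evaluated at the earlier time s
    target = w_teacher * xs
    pred = w_student * xt
    grad = 2.0 * (pred - target) * xt    # d/dw of the squared error

    # Cap the gradient -- an illustrative stand-in for the paper's
    # reformulation that avoids excessive gradient norms
    grad = float(np.clip(grad, -max_grad, max_grad))
    w_student -= lr * grad

    # EMA update of the teacher
    w_teacher = ema_decay * w_teacher + (1.0 - ema_decay) * w_student

print(w_student, w_teacher)
```

Note that with no boundary condition anchoring the map at t = 0, a setup like this can drift toward degenerate solutions, which is exactly the failure mode the abstract's flow-map analysis is concerned with.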

Keywords
Consistency (knowledge bases), Convergence (economics), Stability (learning theory), Flow (mathematics), Generative grammar, Work (physics), Degenerate energy levels
Type
preprint
IF / Citations
- / 0
Publication year
2026