Stabilizing Consistency Training: A Flow Map Analysis and Self-Distillation
Youngjoong Kim, Duhoe Kim, Woosung Kim, Jaesik Park
ArXiv.org
Abstract

Consistency models have been proposed for fast generative modeling, achieving results competitive with diffusion and flow models. However, these methods exhibit inherent instability and limited reproducibility when trained from scratch, motivating subsequent work to explain and mitigate these issues. While these efforts have provided valuable insights, the explanations remain fragmented, and the theoretical relationships among them remain unclear. In this work, we provide a theoretical examination of consistency models by analyzing them from a flow map-based perspective. This joint analysis clarifies how training stability and convergence behavior can give rise to degenerate solutions. Building on these insights, we revisit self-distillation as a practical remedy for certain forms of suboptimal convergence and reformulate it to avoid excessive gradient norms, yielding stable optimization. We further demonstrate that our strategy extends beyond image generation to diffusion-based policy learning, without relying on a pretrained diffusion model for initialization, thereby illustrating its broader applicability.
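The ingredients the abstract names — a flow map, a self-distillation target taken from the model's own output at another time, gradient-norm control, and the risk of degenerate solutions — can be caricatured in a toy example. Everything below (the one-parameter `flow_map` parameterization, the time pair `(t, s)`, the learning rate and clipping threshold) is a hypothetical sketch for illustration only, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-parameter "flow map": f_theta(x, t) = x + t * theta * x.
# The boundary condition f_theta(x, 0) = x holds by construction.
def flow_map(theta, x, t):
    return x + t * theta * x

def consistency_step(theta, x, t=0.8, s=0.4, lr=0.1, max_grad=1.0):
    """One training step: match f at time t to a stop-gradient copy of f
    at an earlier time s on the same trajectory (self-distillation-style),
    with the gradient clipped to keep the update size bounded."""
    target = flow_map(theta, x, s)              # treated as a constant target
    pred = flow_map(theta, x, t)
    diff = pred - target
    loss = float(np.mean(diff ** 2))
    grad = float(np.mean(2.0 * diff * t * x))   # d(pred)/d(theta) = t * x
    grad = max(-max_grad, min(max_grad, grad))  # crude gradient-norm control
    return theta - lr * grad, loss

theta = 2.0
x = rng.normal(size=256)
losses = []
for _ in range(200):
    theta, loss = consistency_step(theta, x)
    losses.append(loss)
# The loss falls, but theta drifts toward 0, i.e. toward the trivial map
# f(x, t) ~= x: a degenerate solution of the kind the abstract describes,
# reached even though every individual update looks well-behaved.
```

In this toy, clipping keeps each update bounded (standing in for the paper's reformulation that avoids excessive gradient norms), yet the purely self-referential target still lets the map collapse toward the identity, which is why the abstract treats stability and degenerate convergence as two distinct failure modes.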

Keywords
Consistency (knowledge bases) · Convergence (economics) · Stability (learning theory) · Flow (mathematics) · Generative grammar · Work (physics) · Degenerate energy levels
Type
article
IF / Citations
- / 0
Publication year
2026
