Enhancing Mixture of Experts with Independent and Collaborative Learning for Long-Tail Visual Recognition
Yanhao Chen, Zhongquan Jian, Nuo Ke, S. G. Hu, Junjie Jiao, Qingqi Hong, Qingqiang Wu
Abstract

Deep neural networks (DNNs) face substantial challenges in Long-Tail Visual Recognition (LTVR) due to the inherent class imbalances in real-world data distributions. The Mixture of Experts (MoE) framework has emerged as a promising approach to addressing these issues. However, in MoE systems, experts are typically trained to optimize a collective objective, often neglecting the individual optimality of each expert. This individual optimality usually contributes to the overall performance, as the goals of different experts are not mutually exclusive. We propose the Independent and Collaborative Learning (ICL) framework to optimize each expert independently while ensuring global optimality. First, Diverse Optimization Learning (DOL) is introduced to enhance expert diversity and individual performance. Then, we conceptualize experts as parallel circuit branches and introduce Competition and Collaboration Learning (CoL). Competition Learning amplifies the gradients of better-performing experts to preserve individual optimality, and Collaboration Learning encourages collaboration through mutual distillation to enhance optimal knowledge sharing. ICL achieves state-of-the-art accuracy in experiments on CIFAR-100/10-LT, ImageNet-LT, and iNaturalist 2018. Our code is available at https://github.com/PolarisLight/ICL.
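The Collaboration Learning component described above relies on mutual distillation, in which each expert learns from the softened predictions of its peers. The sketch below is an illustrative minimal implementation of a generic mutual-distillation loss (average pairwise KL divergence between experts' temperature-softened output distributions), not the authors' actual method; the temperature value and function names are assumptions, and the paper's implementation is in the linked repository.

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over a list of logits, softened by a temperature."""
    m = max(l / temperature for l in logits)
    exps = [math.exp(l / temperature - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions over the same classes."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_distillation_loss(expert_logits, temperature=2.0):
    """Average pairwise KL divergence between experts' softened predictions.

    expert_logits: list of per-expert logit vectors for one sample.
    Returns 0 when all experts already agree.
    """
    probs = [softmax(logits, temperature) for logits in expert_logits]
    n = len(probs)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += kl_divergence(probs[i], probs[j])
                pairs += 1
    return total / pairs
```

Minimizing this term pulls each expert's distribution toward its peers', which is one standard way to realize the knowledge sharing the abstract refers to; the gradient amplification of Competition Learning would be applied separately, on each expert's own classification loss.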

Keywords
Collaborative learning, Class (philosophy), Competition (biology), Facial recognition system, Matching (statistics), Code (set theory), Face (sociological concept)
Type
article
IF / Citations
- / 0
Publication Year
2025
