Class relationship‐based knowledge distillation for efficient human parsing
Yuqi Lang, Kunliang Liu, Jianming Wang, Wonjun Hwang
Electronics Letters (IF 0.7)
Abstract

In computer vision, human parsing is challenging because it demands both accurate localisation of human regions and fine-grained semantic partitioning. As a dense prediction task, it typically requires heavy computation and high-precision models. To enable real-time parsing on resource-limited devices, the authors introduce a lightweight model built on a ResNet18 backbone. They simplify the pyramid module, improving contextual representation while reducing complexity, and integrate a spatial attention fusion strategy to counter the precision loss incurred by light-weighting. Traditional models, despite their segmentation precision, are limited by their computational complexity and large parameter counts, so the authors apply knowledge distillation (KD) to improve the lightweight network's accuracy. Conventional distillation methods can fail to transfer useful knowledge when the teacher and student networks differ significantly, so the authors propose a novel distillation approach based on inter-class and intra-class relations in the prediction outputs, which noticeably improves parsing accuracy. Experiments on the Look into Person (LIP) dataset show that the lightweight model significantly reduces parameters while maintaining parsing precision and improving inference speed.
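The abstract describes a distillation loss defined over inter-class relations in the prediction outputs, but does not give the exact formulation. A minimal sketch of one way such a relation-based KD term could work is shown below; the function names, the Gram-matrix relation measure, and the mean-squared-error comparison are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(logits, axis=0):
    # Numerically stable softmax over the class axis.
    e = np.exp(logits - logits.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def class_relation_matrix(probs):
    # probs: (C, N) per-pixel class probabilities, flattened over pixels.
    # L2-normalise each class's response, then take the Gram matrix,
    # giving a (C, C) inter-class similarity ("relation") matrix.
    norm = probs / (np.linalg.norm(probs, axis=1, keepdims=True) + 1e-8)
    return norm @ norm.T

def relation_kd_loss(teacher_logits, student_logits):
    # Encourage the student's inter-class relations to match the teacher's,
    # rather than matching raw logits pointwise.
    rt = class_relation_matrix(softmax(teacher_logits))
    rs = class_relation_matrix(softmax(student_logits))
    return float(np.mean((rt - rs) ** 2))
```

Matching relation matrices rather than raw outputs is what makes this style of distillation tolerant of large capacity gaps between teacher and student: the student only needs to reproduce how classes relate to one another, not the teacher's exact activations.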

Keywords
Computer science, Parsing, Artificial intelligence, Machine learning, Context (archaeology), Inference, Weighting, Process (computing), Class (philosophy), Benchmark (surveying)
Type
article
IF / Citations
0.7 / 2
Publication year
2023
