Poster: Clipped Quantization and Huffman Coding for Efficient Secure Transfer in Federated Learning
Seung-Ho Lim, Min Choi, Ki-Woong Park
Abstract

Federated Learning (FL) has emerged as a method for training on private data through distributed learning of shared parameters; however, it incurs high communication overhead and remains exposed to attacks on the model parameters. To minimize the communication overhead of federated learning while preserving its accuracy and security, we consider a combination of gradient quantization, clipping, and Huffman coding. Our system produces reduced parameters through quantization and clipping, then encodes them with Huffman coding, which further increases the compression ratio as well as the security of the transmitted parameters. Preliminary results show that the scheme can significantly reduce the amount of transferred data while preserving accuracy and security.
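
The abstract describes a three-step pipeline: clip each gradient to a fixed range, quantize the clipped values to a small number of integer levels, and Huffman-encode the resulting symbols. Below is a minimal sketch of that general technique, not the authors' implementation; the clipping threshold, bit width, and all function names are illustrative assumptions.

    # Minimal sketch of clipped quantization + Huffman coding (illustrative only).
    import heapq
    from collections import Counter
    import numpy as np

    def clip_and_quantize(grads, clip=0.01, bits=4):
        """Clip gradients to [-clip, clip] and map them to 2**bits integer levels.
        The receiver can reconstruct approximate values as symbol * step - clip."""
        levels = 2 ** bits
        clipped = np.clip(grads, -clip, clip)
        step = (2 * clip) / (levels - 1)           # uniform step over the clipped range
        symbols = np.round((clipped + clip) / step).astype(int)
        return symbols, step

    def build_huffman_codes(symbols):
        """Build a Huffman code table from symbol frequencies."""
        freq = Counter(symbols.ravel().tolist())
        # Heap entries: (frequency, unique tiebreaker, tree); a tree is a symbol or a pair.
        heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                          # degenerate case: one distinct symbol
            return {heap[0][2]: "0"}
        count = len(heap)
        while len(heap) > 1:
            f1, _, t1 = heapq.heappop(heap)
            f2, _, t2 = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
            count += 1
        codes = {}
        def walk(tree, prefix):
            if isinstance(tree, tuple):
                walk(tree[0], prefix + "0")
                walk(tree[1], prefix + "1")
            else:
                codes[tree] = prefix
        walk(heap[0][2], "")
        return codes

    # Toy usage: compress a fake gradient vector and report the bit savings.
    rng = np.random.default_rng(0)
    grads = rng.normal(scale=0.005, size=10_000).astype(np.float32)
    symbols, step = clip_and_quantize(grads)
    codes = build_huffman_codes(symbols)
    encoded_bits = sum(len(codes[s]) for s in symbols.ravel().tolist())
    print(f"float32: {grads.size * 32} bits, Huffman-coded: {encoded_bits} bits")

Because clipping concentrates the quantized symbols around the middle levels, their distribution is highly skewed, which is exactly the regime where Huffman coding shortens the average code length beyond what fixed-width quantization alone achieves.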

Keywords
Huffman coding, Computer science, Quantization (signal processing), Tunstall coding, Coding (social sciences), Transfer of learning, Speech recognition, Artificial intelligence, Theoretical computer science, Data compression
Type
article
IF / Citations
- / 1
Publication Year
2024
