Instruction-Guided Autoregressive Neural Network Parameter Generation
Soro Bedionita, Bruno Andreis, Song Chong, Sung Ju Hwang
ArXiv.org
Abstract

Learning to generate neural network parameters conditioned on task descriptions and architecture specifications is pivotal for advancing model adaptability and transfer learning. Existing methods, especially those based on diffusion models, suffer from limited scalability to large architectures, rigidity in handling varying network depths, and disjointed parameter generation that undermines inter-layer coherence. In this work, we propose IGPG (Instruction Guided Parameter Generation), an autoregressive framework that unifies parameter synthesis across diverse tasks and architectures. IGPG leverages a VQ-VAE and an autoregressive model to generate neural network parameters, conditioned on task instructions, dataset, and architecture details. By autoregressively generating neural network weight tokens, IGPG ensures inter-layer coherence and enables efficient adaptation across models and datasets. Operating at the token level, IGPG effectively captures complex parameter distributions aggregated from a broad spectrum of pretrained models. Extensive experiments on multiple vision datasets demonstrate that IGPG consolidates diverse pretrained models into a single, flexible generative framework. The synthesized parameters achieve competitive or superior performance relative to state-of-the-art methods, especially in terms of scalability and efficiency when applied to large architectures. These results underscore IGPG's potential as a powerful tool for pretrained weight retrieval, model selection, and rapid task-specific fine-tuning.
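The following is a minimal conceptual sketch of the two-stage idea the abstract describes, not the authors' released code: a VQ-VAE-style tokenizer that maps flattened weight chunks to discrete codes, and an autoregressive decoder that predicts those weight tokens conditioned on an instruction embedding. All module names, dimensions, and the choice of a Transformer decoder with the instruction as cross-attention memory are illustrative assumptions.

```python
# Conceptual sketch only, assuming PyTorch; sizes and names are illustrative.
import torch
import torch.nn as nn

class WeightTokenizer(nn.Module):
    """Toy VQ layer: encode a flattened weight chunk, snap it to the nearest codebook entry."""
    def __init__(self, chunk_dim=256, code_dim=64, num_codes=512):
        super().__init__()
        self.encoder = nn.Linear(chunk_dim, code_dim)
        self.decoder = nn.Linear(code_dim, chunk_dim)
        self.codebook = nn.Embedding(num_codes, code_dim)

    def encode(self, chunks):                       # chunks: (N, chunk_dim)
        z = self.encoder(chunks)                    # (N, code_dim)
        dists = torch.cdist(z, self.codebook.weight)
        return dists.argmin(dim=-1)                 # discrete weight-token ids, (N,)

    def decode(self, token_ids):                    # token ids -> reconstructed weight chunks
        return self.decoder(self.codebook(token_ids))

class InstructionConditionedAR(nn.Module):
    """Toy autoregressive decoder: next weight token given previous tokens and an instruction embedding."""
    def __init__(self, num_codes=512, d_model=128, n_layers=2, n_heads=4, instr_dim=384):
        super().__init__()
        self.token_emb = nn.Embedding(num_codes, d_model)
        self.instr_proj = nn.Linear(instr_dim, d_model)   # e.g. a sentence embedding of the task/architecture
        layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)
        self.head = nn.Linear(d_model, num_codes)

    def forward(self, token_ids, instr_emb):        # token_ids: (B, T), instr_emb: (B, instr_dim)
        x = self.token_emb(token_ids)
        memory = self.instr_proj(instr_emb).unsqueeze(1)   # conditioning via cross-attention memory
        mask = nn.Transformer.generate_square_subsequent_mask(token_ids.size(1))
        h = self.decoder(x, memory, tgt_mask=mask)
        return self.head(h)                         # next-token logits over the weight codebook
```

In this sketch, generating parameters for a target network would mean sampling weight tokens autoregressively from `InstructionConditionedAR` and decoding them back to weight chunks with `WeightTokenizer.decode`; the actual IGPG training and conditioning details are specified in the paper.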

Keywords
Autoregressive model · Artificial neural network · Computer science · Nonlinear autoregressive exogenous model · SETAR · STAR model · Econometrics · Artificial intelligence · Machine learning · Mathematics
Type
preprint
IF / Citations
- / 0
Publication Year
2025