8 June 2024 LoRA-SP: streamlined partial parameter adaptation for resource efficient fine-tuning of large language models
Yichao Wu, Yafei Xiang, Shuning Huo, Yulu Gong, Penghao Liang
Proceedings Volume 13171, Third International Conference on Algorithms, Microchips, and Network Applications (AMNA 2024); 131711Z (2024) https://doi.org/10.1117/12.3032013
Event: 3rd International Conference on Algorithms, Microchips and Network Applications (AMNA 2024), 2024, Jinan, China
Abstract
In addressing the computational and memory demands of fine-tuning Large Language Models (LLMs), we propose LoRA-SP (Streamlined Partial Parameter Adaptation), a novel approach that applies randomized half-selective parameter freezing within the Low-Rank Adaptation (LoRA) framework. This method efficiently balances the retention of pre-trained knowledge with adaptability for task-specific optimization. Through a randomized mechanism, LoRA-SP determines which parameters to update and which to freeze, significantly reducing computational and memory requirements without compromising model performance. We evaluated LoRA-SP across several benchmark NLP tasks, demonstrating that it achieves competitive performance with substantially lower resource consumption than traditional full-parameter fine-tuning and other parameter-efficient techniques. LoRA-SP's approach not only facilitates the deployment of advanced NLP models in resource-limited settings but also opens new research avenues into effective and efficient model adaptation strategies.
© (2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Yichao Wu, Yafei Xiang, Shuning Huo, Yulu Gong, and Penghao Liang "LoRA-SP: streamlined partial parameter adaptation for resource efficient fine-tuning of large language models", Proc. SPIE 13171, Third International Conference on Algorithms, Microchips, and Network Applications (AMNA 2024), 131711Z (8 June 2024); https://doi.org/10.1117/12.3032013
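To make the freezing mechanism described in the abstract concrete, the sketch below shows one possible reading of randomized half-selective parameter freezing inside a LoRA layer, written in PyTorch. It is an illustrative assumption, not the authors' implementation: the class name LoRASPLinear, the per-layer choice of freezing either the A or the B low-rank factor (roughly half of the adapter parameters), and all hyperparameters are hypothetical, since the abstract does not specify the selection granularity.

```python
import torch
import torch.nn as nn

class LoRASPLinear(nn.Module):
    """Minimal sketch of a LoRA layer with randomized partial freezing.

    Hypothetical illustration: per layer, one of the two low-rank factors
    (A or B) is randomly frozen, so only about half of the adapter
    parameters receive gradient updates. The pre-trained weight stays
    frozen, as in standard LoRA.
    """

    def __init__(self, in_features, out_features, rank=8, alpha=16.0):
        super().__init__()
        # Frozen pre-trained weight (randomly initialized here for the sketch).
        self.weight = nn.Parameter(
            torch.empty(out_features, in_features), requires_grad=False
        )
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

        # Low-rank factors: output = x W^T + (alpha / rank) * x A^T B^T.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

        # Randomized half-selective freezing: with probability 0.5 freeze A
        # (train only B), otherwise freeze B (train only A).
        if torch.rand(1).item() < 0.5:
            self.lora_A.requires_grad_(False)
        else:
            self.lora_B.requires_grad_(False)

    def forward(self, x):
        update = (x @ self.lora_A.t()) @ self.lora_B.t() * self.scaling
        return x @ self.weight.t() + update


# Usage sketch: adapt a single projection and optimize only the unfrozen factor.
layer = LoRASPLinear(768, 768)
optimizer = torch.optim.AdamW(
    [p for p in layer.parameters() if p.requires_grad], lr=1e-4
)
```

In a full model, this random freeze decision would be drawn independently for each adapted weight matrix, which is one way the optimizer and gradient state could shrink relative to training both LoRA factors everywhere.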
KEYWORDS: Performance modeling, Education and training, Fourier transforms, Matrices, Mathematical optimization, Data modeling, Quantization