
The impact of LoRA adapters on LLMs for clinical text classification under computational and data constraints

Le, Than-Dung, Nguyen, Ti Ti, Nguyen Ha, Vu, Chatzinotas, Symeon, Jouvet, Philippe and Noumeir, Rita. 2025. « The impact of LoRA adapters on LLMs for clinical text classification under computational and data constraints ». IEEE Access, vol. 13. pp. 109365-109377.

Noumeir-R-2025-31213.pdf - Published Version
Use licence: Creative Commons CC BY-NC-ND.
Abstract

Fine-tuning Large Language Models (LLMs) for clinical Natural Language Processing (NLP) poses significant challenges due to the domain gap, limited data, and stringent hardware constraints. In this study, we evaluate four adapter techniques, namely Adapter, Lightweight, TinyAttention, and Gated Residual Network (GRN), equivalent to Low-Rank Adaptation (LoRA), for clinical note classification under real-world, resource-constrained conditions. All experiments were conducted on a single NVIDIA Quadro P620 GPU (2 GB VRAM, 512 CUDA cores, 1.386 TFLOPS FP32), limiting batch sizes to ≤8 sequences and the maximum sequence length to 256 tokens. Our clinical corpus comprises only 580,000 tokens, several orders of magnitude smaller than standard LLM pre-training datasets. We fine-tuned three biomedical pre-trained LLMs (CamemBERT-bio, AliBERT, DrBERT) and two lightweight Transformer models trained from scratch. Results show that (1) adapter structures provide no consistent gains when fine-tuning biomedical LLMs under these constraints, and (2) simpler Transformers, with minimal parameter counts and training times under six hours, outperform adapter-augmented LLMs, which required over 1,000 GPU-hours. Among the adapters, GRN achieved the best metrics (accuracy, precision, recall, and F1 all at 0.88). These findings demonstrate that, in low-resource clinical settings with limited data and compute, lightweight Transformers trained from scratch offer a more practical and efficient solution than large LLMs, while GRN remains a viable adapter choice when minimal adaptation is needed.
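The LoRA-style adapters compared in the abstract all inject a small trainable update into a frozen weight matrix rather than fine-tuning the full model. As a rough illustration of the idea (not the paper's implementation; all shapes, names, and hyperparameters below are illustrative), a LoRA update adds a rank-r product B·A, scaled by alpha/r, to a frozen linear layer:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=8):
    """Linear layer with a LoRA adapter.

    x: (batch, d_in) input
    W: (d_out, d_in) frozen base weight
    A: (r, d_in) trainable down-projection (small Gaussian init)
    B: (d_out, r) trainable up-projection (zero init)
    """
    r = A.shape[0]
    base = x @ W.T                # frozen path, untouched by training
    update = (x @ A.T) @ B.T      # low-rank trainable path
    return base + (alpha / r) * update

rng = np.random.default_rng(0)
d_in, d_out, r = 16, 8, 4
W = rng.normal(size=(d_out, d_in))
A = rng.normal(scale=0.01, size=(r, d_in))
B = np.zeros((d_out, r))          # zero init: adapter starts as a no-op

x = rng.normal(size=(2, d_in))
# With B = 0 the adapted layer reproduces the frozen layer exactly.
assert np.allclose(lora_forward(x, W, A, B), x @ W.T)
```

Only A and B are trained, so the trainable parameter count is r·(d_in + d_out) (here 96) instead of d_out·d_in (here 128), a saving that grows with layer size; the four adapter variants in the study trade off this budget differently.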

Item Type: Peer reviewed article published in a journal
Professor: Noumeir, Rita
Affiliation: Génie électrique
Date Deposited: 30 Jul 2025 13:28
Last Modified: 12 Aug 2025 20:05
URI: https://espace2.etsmtl.ca/id/eprint/31213
