
The impact of LoRA adapters on LLMs for clinical text classification under computational and data constraints

Le, Thanh-Dung, Nguyen, Ti Ti, Nguyen Ha, Vu, Chatzinotas, Symeon, Jouvet, Philippe and Noumeir, Rita. 2025. "The impact of LoRA adapters on LLMs for clinical text classification under computational and data constraints". IEEE Access, vol. 13, pp. 109365-109377.

PDF: Noumeir-R-2025-31213.pdf - Published version (2 MB)
Usage licence: Creative Commons CC BY-NC-ND

Abstract

Fine-tuning Large Language Models (LLMs) for clinical Natural Language Processing (NLP) poses significant challenges due to the domain gap, limited data, and stringent hardware constraints. In this study, we evaluate four LoRA-equivalent adapter techniques, namely Adapter, Lightweight, TinyAttention, and Gated Residual Network (GRN), for clinical note classification under real-world, resource-constrained conditions. All experiments were conducted on a single NVIDIA Quadro P620 GPU (2 GB VRAM, 512 CUDA cores, 1.386 TFLOPS FP32), limiting batch sizes to at most 8 sequences and the maximum sequence length to 256 tokens. Our clinical corpus comprises only 580,000 tokens, several orders of magnitude smaller than standard LLM pre-training datasets. We fine-tuned three biomedical pre-trained LLMs (CamemBERT-bio, AliBERT, DrBERT) and two lightweight Transformer models trained from scratch. Results show that (1) adapter structures provide no consistent gains when fine-tuning biomedical LLMs under these constraints, and (2) simpler Transformers, with minimal parameter counts and training times under six hours, outperform adapter-augmented LLMs, which required over 1,000 GPU-hours. Among the adapters, GRN achieved the best metrics (accuracy, precision, recall, and F1 all at 0.88). These findings demonstrate that, in low-resource clinical settings with limited data and compute, lightweight Transformers trained from scratch offer a more practical and efficient solution than large LLMs, while GRN remains a viable adapter choice when minimal adaptation is needed.
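For readers unfamiliar with the adapter families named in the abstract, the following PyTorch sketch illustrates two of them: a LoRA-style low-rank update and a Gated Residual Network (GRN) bottleneck. It is a minimal illustration only; the class names, rank, bottleneck width, and insertion points are assumptions and do not reproduce the authors' implementation or hyper-parameters.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankAdapter(nn.Module):
    # LoRA-style update: y = W x + (alpha / r) * B(A x), with the pre-trained W frozen.
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # only the low-rank factors A and B are trained
        self.down = nn.Linear(base.in_features, rank, bias=False)   # A: project down to rank r
        self.up = nn.Linear(rank, base.out_features, bias=False)    # B: project back up
        nn.init.zeros_(self.up.weight)       # the update starts as a no-op
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.up(self.down(x))

class GatedResidualAdapter(nn.Module):
    # Gated Residual Network block: LayerNorm(x + GLU(W_out * ELU(W_in x))).
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.fc_in = nn.Linear(hidden_size, bottleneck)
        self.fc_out = nn.Linear(bottleneck, hidden_size)
        self.gate = nn.Linear(hidden_size, 2 * hidden_size)  # GLU halves the width again
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.fc_out(F.elu(self.fc_in(x)))
        return self.norm(x + F.glu(self.gate(h), dim=-1))

In practice, modules of this kind would wrap the frozen attention or feed-forward projections of a pre-trained encoder such as CamemBERT-bio, so that only the small adapter weights are updated during fine-tuning.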

Document type: Peer-reviewed journal article
Professor: Noumeir, Rita
Affiliation: Electrical Engineering
Deposited on: 30 July 2025 13:28
Last modified: 12 August 2025 20:05
URI: https://espace2.etsmtl.ca/id/eprint/31213
