
Serverless on machine learning: A systematic mapping study


Barrak, Amine, Petrillo, Fabio and Jaafar, Fehmi. 2022. "Serverless on machine learning: A systematic mapping study". IEEE Access, vol. 10, pp. 99337-99352.
Citation count in Scopus: 8.

PDF: Petrillo-F-2022-25654.pdf - Published version (2 MB)
Usage license: Creative Commons CC BY.

Abstract

Machine Learning Operations (MLOps) is an approach to managing the entire lifecycle of a machine learning model. It has evolved over recent years and has begun attracting many people in research and in industry. It supports the development of machine learning (ML) pipelines, which typically cover the phases of data collection, data pre-processing, building datasets, model training, hyper-parameter refinement, testing, and deployment to production. This complex pipeline workflow is a tedious process of iterative experimentation. Moreover, cloud computing services provide advanced features for managing ML stages and deploying them efficiently to production. Specifically, serverless computing has been applied in different stages of the machine learning pipeline. However, to the best of our knowledge, the suitability of serverless computing and the benefits it can provide to the ML pipeline remain unclear. In this paper, we provide a systematic mapping study of machine learning systems applied on serverless architecture that includes 53 relevant studies. During this study, we focused on (1) exploring the evolution trend and the main venues; (2) determining the researchers' focus and interest in using serverless on machine learning; (3) discussing the solutions that serverless computing provides to machine learning. Our results show that serverless usage is growing, and several venues are interested in the topic. In addition, we found that the most widely used serverless provider is AWS Lambda, and that its primary application is the deployment of ML models. Additionally, we explored several challenges, such as reducing cost, scaling resources, and reducing latency. We also discuss the potential challenges of adopting ML on serverless, such as respecting service level agreements, the cold start problem, security, and privacy. Finally, our contribution provides foundations for future research and applications that involve machine learning in serverless computing.
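To make the deployment scenario concrete, the sketch below shows a minimal serverless inference function in the style the study identifies as most common: an ML model served from an AWS Lambda handler. The file name model.joblib, the request schema, and the use of joblib are illustrative assumptions and are not taken from the paper.

# Minimal sketch of ML model inference on AWS Lambda (illustrative only).
# Assumes the deployment package bundles joblib and a trained model saved
# as "model.joblib"; the event/request schema is a hypothetical example.
import json

import joblib

# Loading the model at module scope lets warm invocations reuse it; the load
# cost is paid only on a cold start, one of the challenges the paper notes.
MODEL = joblib.load("model.joblib")


def lambda_handler(event, context):
    """Return a prediction for the feature matrix in the request body."""
    body = json.loads(event.get("body") or "{}")
    features = body["features"]  # e.g. [[5.1, 3.5, 1.4, 0.2]]
    prediction = MODEL.predict(features).tolist()
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": prediction}),
    }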

Document type: Peer-reviewed journal article
Professor: Petrillo, Fabio
Affiliation: Software Engineering and Information Technology
Deposit date: 17 Oct. 2022 14:01
Last modified: 15 Nov. 2022 14:04
URI: https://espace2.etsmtl.ca/id/eprint/25654
