
Serverless on machine learning: A systematic mapping study

Barrak, Amine, Petrillo, Fabio, and Jaafar, Fehmi. 2022. "Serverless on machine learning: A systematic mapping study". IEEE Access, vol. 10, pp. 99337-99352.

PDF: Petrillo-F-2022-25654.pdf - Published Version
Use licence: Creative Commons CC BY.

Abstract

Machine Learning Operations (MLOps) is an approach to managing the entire lifecycle of a machine learning model. It has evolved in recent years and has begun to attract considerable attention from both researchers and industry. It supports the development of machine learning (ML) pipelines through their typical phases: data collection, data pre-processing, dataset construction, model training, hyper-parameter tuning, testing, and deployment to production. This complex pipeline workflow is a tedious process of iterative experimentation. Moreover, cloud computing services provide advanced features for managing ML stages and deploying them efficiently to production. In particular, serverless computing has been applied in different stages of the machine learning pipeline. However, to the best of our knowledge, the suitability of serverless computing for the ML pipeline and the benefits it can provide have not yet been systematically assessed. In this paper, we provide a systematic mapping study of machine learning systems built on serverless architectures, covering 53 relevant studies. In this study, we focused on (1) exploring the evolution trend and the main publication venues; (2) determining researchers' focus and interest in using serverless computing for machine learning; and (3) discussing the solutions that serverless computing provides to machine learning. Our results show that serverless usage is growing and that several venues are interested in the topic. In addition, we found that the most widely used serverless provider is AWS Lambda, and that its primary application is the deployment of ML models. Additionally, several challenges addressed by serverless computing were explored, such as cost reduction, resource scalability, and latency reduction. We also discuss the potential challenges of adopting ML on serverless platforms, such as respecting service-level agreements, the cold-start problem, security, and privacy. Finally, our contribution provides foundations for future research and applications that involve machine learning in serverless computing.
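
To illustrate the deployment pattern the abstract identifies as most common (serving an ML model from AWS Lambda), the sketch below shows a minimal Python Lambda handler. It is not taken from the paper: the S3 bucket, object key, request format, and scikit-learn-style model interface are hypothetical placeholders, and the module-level model cache is one common way to soften the cold-start problem mentioned above.

# Minimal sketch (not from the paper) of serving a pre-trained ML model from AWS Lambda.
# Bucket name, object key, and feature layout are hypothetical.
import json
import pickle

import boto3

s3 = boto3.client("s3")
_model = None  # cached at module level so warm invocations skip the download


def _load_model():
    """Download and unpickle the model once per container (hypothetical S3 location)."""
    global _model
    if _model is None:
        obj = s3.get_object(Bucket="example-model-bucket", Key="model.pkl")
        _model = pickle.loads(obj["Body"].read())
    return _model


def handler(event, context):
    """Lambda entry point: read a feature vector from the request body and return a prediction."""
    features = json.loads(event["body"])["features"]
    prediction = _load_model().predict([features])[0]
    return {"statusCode": 200, "body": json.dumps({"prediction": float(prediction)})}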

Item Type: Peer-reviewed article published in a journal
Professor: Petrillo, Fabio
Affiliation: Software Engineering and Information Technology (Génie logiciel et des technologies de l'information)
Date Deposited: 17 Oct 2022 14:01
Last Modified: 15 Nov 2022 14:04
URI: https://espace2.etsmtl.ca/id/eprint/25654
