Liu, Xu, Ounifi, Hibat Allah, Gherbi, Abdelouahed, Lemieux, Yves and Li, Wubin.
2018.
« A hybrid GPU-FPGA-based computing platform for machine learning ».
In The 9th International Conference on Emerging Ubiquitous Systems and Pervasive Networks (EUSPN-2018) / The 8th International Conference on Current and Future Trends of Information and Communication Technologies in Healthcare (ICTH-2018) (Leuven, Belgium, Nov. 5-8, 2018).
Series: Procedia Computer Science, vol. 141.
pp. 104-111.
Elsevier B.V.
Citation count in Scopus: 18.
Gherbi A 2018 17955.pdf - Published version (1MB). Usage license: Creative Commons CC BY-NC-ND.
Abstract
We present a hybrid GPU-FPGA computing platform to tackle the high-density computing problem of machine learning. In our platform, the training part of a machine learning application is implemented on a GPU and the inference part is implemented on an FPGA. The platform also includes a model transplantation part, which transfers the trained model from the training part to the inference part. To evaluate this design methodology, we selected LeNet-5 as our benchmark algorithm. During the training phase, the TitanXp GPU was about 8.8x faster than the E-1620 CPU; in the inference phase, the Arria-10 FPGA was fastest, 44.4x faster than the E-1620 CPU and 6341x faster than the TitanXp GPU. Moreover, by adopting our design methodology, we improved our LeNet-5 model's accuracy from 99.05% to 99.13%, and successfully preserved that accuracy (99.13%) when transplanting the model from the GPU platform to the FPGA platform.
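The abstract describes a train-on-GPU, transplant, infer-on-FPGA flow. As a rough illustrative sketch only (the paper's actual toolchain, hyperparameters, and export format are not given in this record), the Python/PyTorch snippet below trains a LeNet-5-style network on a GPU and dumps its weights to a file that a separate FPGA inference design could load; every name and setting here is an assumption, not the authors' implementation.

```python
# Illustrative sketch: GPU training of a LeNet-5-style model plus a weight
# export step standing in for "model transplantation" to an FPGA design.
# Hyperparameters, file names, and the export format are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class LeNet5(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, kernel_size=5)   # 28x28 -> 24x24
        self.conv2 = nn.Conv2d(6, 16, kernel_size=5)  # 12x12 -> 8x8
        self.fc1 = nn.Linear(16 * 4 * 4, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

def main():
    # Training side runs on the GPU when one is available.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    train_set = datasets.MNIST("data", train=True, download=True,
                               transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=128, shuffle=True)

    model = LeNet5().to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    model.train()
    for epoch in range(5):  # assumed epoch count, not from the paper
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            opt.zero_grad()
            loss = F.cross_entropy(model(images), labels)
            loss.backward()
            opt.step()

    # "Model transplantation" stand-in: save the trained parameters so a
    # separate FPGA inference implementation can reuse the same weights.
    torch.save(model.state_dict(), "lenet5_weights.pt")

if __name__ == "__main__":
    main()
```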
Document type: Conference paper
ISSN: 1877-0509
Professor: Gherbi, Abdelouahed
Affiliation: Software and Information Technology Engineering
Deposit date: 22 Jan. 2019 15:25
Last modified: 12 Jul. 2019 20:14
URI: https://espace2.etsmtl.ca/id/eprint/17955