
FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms


Bibliographic Details
Main Authors: Li, Ming, Zhang, Jianshan, Lin, Jingfeng, Chen, Zheyi, Zheng, Xianghan
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10535806/
https://www.ncbi.nlm.nih.gov/pubmed/37765893
http://dx.doi.org/10.3390/s23187829
_version_ 1785112717111066624
author Li, Ming
Zhang, Jianshan
Lin, Jingfeng
Chen, Zheyi
Zheng, Xianghan
author_facet Li, Ming
Zhang, Jianshan
Lin, Jingfeng
Chen, Zheyi
Zheng, Xianghan
author_sort Li, Ming
collection PubMed
description Serverless computing has emerged as a captivating paradigm for deploying cloud applications, relieving developers of infrastructure resource management by letting them configure only necessary parameters such as latency and memory constraints. Existing resource configuration solutions for cloud-based serverless applications can be broadly classified into modeling based on historical data or a combination of sparse measurements and interpolation/modeling. In pursuit of faster service response and conserving network bandwidth, platforms have progressively expanded from the traditional cloud to the edge. Compared to cloud platforms, serverless edge platforms often incur higher running overhead due to their limited resources, resulting in undesirable financial costs for developers when using the existing solutions. Meanwhile, it is extremely challenging to handle the heterogeneity of edge platforms, which are characterized by distinct pricing owing to their varying resource preferences. To tackle these challenges, we propose an adaptive and efficient approach called FireFace, consisting of prediction and decision modules. The prediction module extracts the internal features of all functions within the serverless application and uses this information to predict the execution time of the functions under specific configuration schemes. Building on the prediction module, the decision module analyzes the environment information and uses the Adaptive Particle Swarm Optimization with Genetic Algorithm Operators (APSO-GA) algorithm to select the most suitable configuration plan for each function, including CPU, memory, and edge platform. In this way, it is possible to effectively minimize the financial overhead while fulfilling the Service Level Objectives (SLOs). Extensive experimental results show that our prediction model obtains the best results under all three metrics, and its prediction error rate for real-world serverless applications is in the range of 4.25∼9.51%. Our approach finds the optimal resource configuration scheme for each application, saving 7.2∼44.8% on average compared to other classic algorithms. Moreover, FireFace exhibits rapid adaptability, efficiently adjusting resource allocation schemes in response to dynamic environments.
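To make the decision-module idea in the abstract concrete, the following is a minimal, self-contained Python sketch of that kind of search: per-function configurations over CPU, memory, and edge platform are scored with a predicted execution time, priced, and penalized when the end-to-end latency misses the SLO, while a population-based loop mixes GA-style crossover and mutation around the best plan found so far. Every number, the pricing and latency models, and all names here (predict_exec_time, cost_and_latency, search, the platform prices) are hypothetical placeholders; this is not the authors' prediction model or their APSO-GA implementation.

```python
# Toy sketch of SLO-constrained, cost-minimizing function configuration.
# All values and models below are illustrative assumptions, not FireFace's.
import random

CPUS = [0.5, 1.0, 2.0]                              # vCPU options (hypothetical)
MEMS = [128, 256, 512, 1024]                        # memory options in MB (hypothetical)
PLATFORMS = {"edge-a": 1.3e-5, "edge-b": 1.8e-5}    # $ per GB*s (hypothetical prices)

def predict_exec_time(func_size, cpu, mem):
    """Stand-in for the prediction module: more CPU/memory -> shorter runtime."""
    return func_size / (cpu * (mem / 256) ** 0.5)

def cost_and_latency(plan, func_sizes):
    """Sum predicted runtime and platform-priced cost over all functions."""
    latency, cost = 0.0, 0.0
    for size, (cpu, mem, plat) in zip(func_sizes, plan):
        t = predict_exec_time(size, cpu, mem)
        latency += t
        cost += PLATFORMS[plat] * (mem / 1024) * t
    return cost, latency

def fitness(plan, func_sizes, slo):
    """Financial cost plus a heavy penalty for violating the latency SLO."""
    cost, latency = cost_and_latency(plan, func_sizes)
    return cost + 1e3 * max(0.0, latency - slo)

def random_plan(n):
    return [(random.choice(CPUS), random.choice(MEMS), random.choice(list(PLATFORMS)))
            for _ in range(n)]

def mutate(plan, rate=0.2):
    """GA-style mutation: re-sample some functions' configurations."""
    out = []
    for cpu, mem, plat in plan:
        if random.random() < rate:
            cpu, mem, plat = (random.choice(CPUS), random.choice(MEMS),
                              random.choice(list(PLATFORMS)))
        out.append((cpu, mem, plat))
    return out

def crossover(a, b):
    """GA-style one-point crossover between two candidate plans."""
    point = random.randrange(1, len(a)) if len(a) > 1 else 0
    return a[:point] + b[point:]

def search(func_sizes, slo, swarm=30, iters=200):
    """Rough population-based stand-in for APSO-GA: recombine toward the best plan."""
    population = [random_plan(len(func_sizes)) for _ in range(swarm)]
    best = min(population, key=lambda p: fitness(p, func_sizes, slo))
    for _ in range(iters):
        population = [mutate(crossover(random.choice(population), best))
                      for _ in range(swarm)]
        candidate = min(population, key=lambda p: fitness(p, func_sizes, slo))
        if fitness(candidate, func_sizes, slo) < fitness(best, func_sizes, slo):
            best = candidate
    return best

if __name__ == "__main__":
    sizes = [120.0, 80.0, 200.0]     # hypothetical per-function workload factors
    plan = search(sizes, slo=600.0)
    print(plan, cost_and_latency(plan, sizes))
```

Running the sketch prints one candidate plan with its (cost, latency); in the actual FireFace workflow, predict_exec_time would be replaced by the learned per-function model and the toy loop by the APSO-GA search described in the paper.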
format Online
Article
Text
id pubmed-10535806
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-105358062023-09-29 FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms Li, Ming Zhang, Jianshan Lin, Jingfeng Chen, Zheyi Zheng, Xianghan Sensors (Basel) Article The emerging serverless computing has become a captivating paradigm for deploying cloud applications, alleviating developers’ concerns about infrastructure resource management by configuring necessary parameters such as latency and memory constraints. Existing resource configuration solutions for cloud-based serverless applications can be broadly classified into modeling based on historical data or a combination of sparse measurements and interpolation/modeling. In pursuit of service response and conserving network bandwidth, platforms have progressively expanded from the traditional cloud to the edge. Compared to cloud platforms, serverless edge platforms often lead to more running overhead due to their limited resources, resulting in undesirable financial costs for developers when using the existing solutions. Meanwhile, it is extremely challenging to handle the heterogeneity of edge platforms, characterized by distinct pricing owing to their varying resource preferences. To tackle these challenges, we propose an adaptive and efficient approach called FireFace, consisting of prediction and decision modules. The prediction module extracts the internal features of all functions within the serverless application and uses this information to predict the execution time of the functions under specific configuration schemes. Based on the prediction module, the decision module analyzes the environment information and uses the Adaptive Particle Swarm Optimization algorithm and Genetic Algorithm Operator (APSO-GA) algorithm to select the most suitable configuration plan for each function, including CPU, memory, and edge platforms. In this way, it is possible to effectively minimize the financial overhead while fulfilling the Service Level Objectives (SLOs). Extensive experimental results show that our prediction model obtains optimal results under all three metrics, and the prediction error rate for real-world serverless applications is in the range of 4.25∼9.51%. Our approach can find the optimal resource configuration scheme for each application, which saves 7.2∼44.8% on average compared to other classic algorithms. Moreover, FireFace exhibits rapid adaptability, efficiently adjusting resource allocation schemes in response to dynamic environments. MDPI 2023-09-12 /pmc/articles/PMC10535806/ /pubmed/37765893 http://dx.doi.org/10.3390/s23187829 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Li, Ming
Zhang, Jianshan
Lin, Jingfeng
Chen, Zheyi
Zheng, Xianghan
FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title_full FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title_fullStr FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title_full_unstemmed FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title_short FireFace: Leveraging Internal Function Features for Configuration of Functions on Serverless Edge Platforms
title_sort fireface: leveraging internal function features for configuration of functions on serverless edge platforms
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10535806/
https://www.ncbi.nlm.nih.gov/pubmed/37765893
http://dx.doi.org/10.3390/s23187829
work_keys_str_mv AT liming firefaceleveraginginternalfunctionfeaturesforconfigurationoffunctionsonserverlessedgeplatforms
AT zhangjianshan firefaceleveraginginternalfunctionfeaturesforconfigurationoffunctionsonserverlessedgeplatforms
AT linjingfeng firefaceleveraginginternalfunctionfeaturesforconfigurationoffunctionsonserverlessedgeplatforms
AT chenzheyi firefaceleveraginginternalfunctionfeaturesforconfigurationoffunctionsonserverlessedgeplatforms
AT zhengxianghan firefaceleveraginginternalfunctionfeaturesforconfigurationoffunctionsonserverlessedgeplatforms