Lab Paper Accepted by IEEE Transactions on Parallel and Distributed Systems

Posted by: Yuhui Deng    Date: 2023-10-06

The paper "HashCache: Accelerating Serverless Computing by Skipping Duplicated Function Execution", co-authored by the lab's PhD student Chaorui Wu, Prof. Yuhui Deng, and collaborators, has been accepted by IEEE Transactions on Parallel and Distributed Systems (IEEE TPDS), an authoritative international journal in the field of computer architecture. IEEE TPDS is a CCF Class A journal. The paper will be formally published in 2024. The abstract follows:


Abstract—Serverless computing is a leading force behind deploying and managing software in cloud computing. Cold starts, an innate problem of the serverless model, give rise to significant surges in latency in serverless computing. We tackle this issue by examining the execution of function invocations in serverless applications, and our findings reveal that a large fraction of invocations are duplicates. Motivated by this observation, we propose HashCache, which caches duplicate function invocations to eliminate cold starts. In HashCache, serverless functions are classified into three categories: computational functions, stateful functions, and environment-related functions. Based on this function classification, HashCache associates stateful functions with their states to build an adaptive synchronization mechanism. With this support, HashCache exploits the cached results of computational and stateful functions to serve upcoming invocation requests to the same functions, thereby eliminating cold starts. Moreover, HashCache stores remote files accessed by stateful functions in a local cache layer, which further curtails invocation latency. We implement HashCache within Apache OpenWhisk to build a cache-enabled serverless computing platform. We conduct extensive experiments to quantitatively evaluate the performance of HashCache in terms of invocation latency and resource utilization, and we compare HashCache against two state-of-the-art approaches, FaaSCache and OpenWhisk. The experimental results show that HashCache remarkably reduces invocation latency and resource overhead. More specifically, HashCache reduces the 99th-percentile tail latency relative to FaaSCache and OpenWhisk by up to 91.37% and 95.96%, respectively, on real-world serverless applications. HashCache also lowers resource utilization relative to FaaSCache and OpenWhisk by up to 31.62% and 35.51%, respectively.
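
For readers unfamiliar with the idea, the sketch below illustrates the core intuition of caching results keyed by a hash of the invocation: identical invocations of a pure ("computational") function can be answered from a cache instead of being re-executed, skipping the cold start that a fresh execution might incur. This is a minimal illustrative example only; the class and function names are assumptions made for demonstration, and the sketch does not reflect the paper's actual design for stateful functions, the adaptive synchronization mechanism, or the OpenWhisk integration.

```python
import hashlib
import json

# Illustrative sketch only: a hash-keyed result cache for pure
# ("computational") function invocations. Names are hypothetical and
# do not come from the HashCache paper.
class InvocationCache:
    def __init__(self):
        self._results = {}  # maps invocation hash -> cached result

    @staticmethod
    def _key(func_name, args):
        # Hash the function identity together with its (JSON-serializable)
        # arguments so identical invocations map to the same cache entry.
        payload = json.dumps({"func": func_name, "args": args}, sort_keys=True)
        return hashlib.sha256(payload.encode("utf-8")).hexdigest()

    def invoke(self, func_name, func, args):
        key = self._key(func_name, args)
        if key in self._results:
            # Duplicate invocation: return the cached result and skip execution.
            return self._results[key]
        result = func(**args)        # cold path: actually run the function
        self._results[key] = result  # cache the result for future duplicates
        return result


# Toy usage with a hypothetical "computational" function.
def resize_image(width, height):
    return {"thumbnail": f"{width // 4}x{height // 4}"}

cache = InvocationCache()
print(cache.invoke("resize_image", resize_image, {"width": 1920, "height": 1080}))  # executes
print(cache.invoke("resize_image", resize_image, {"width": 1920, "height": 1080}))  # served from cache
```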