Serverless computing

Serverless computing is a cloud computing execution model in which the cloud provider allocates machine resources on demand, taking care of the servers on behalf of their customers. Serverless is a misnomer in the sense that servers are still used by cloud service providers to execute code for developers. However, developers of serverless applications are not concerned with capacity planning, configuration, management, maintenance, fault tolerance, or scaling of containers, virtual machines, or physical servers. When an app is not in use, there are no computing resources allocated to the app. Pricing is based on the actual amount of resources consumed by an application. [1] It can be a form of utility computing.

One proposed definition for serverless computing that encompasses these ideas is that serverless computing is a "cloud computing paradigm encompassing a class of cloud computing platforms that allow one to develop, deploy, and run applications (or components thereof) in the cloud without allocating and managing virtualized servers and resources or being concerned about other operational aspects. The responsibility for operational aspects, such as fault tolerance or the elastic scaling of computing, storage, and communication resources to match varying application demands, is offloaded to the cloud provider. Providers apply utilization-based billing: they charge cloud users with fine granularity, in proportion to the resources that applications actually consume from the cloud infrastructure, such as computing time, memory, and storage space." [2] Note that the definition of serverless has stretched over time. According to Ben Kehoe, serverless is a spectrum; one should not fixate on a strict definition of serverless nor on any specific serverless technology. Instead, one should focus on a serverless mindset: how to use serverless to solve one's business problems. [3]

Serverless computing can simplify the process of deploying code into production. It does not entirely remove complexity but mainly shifts it from the operations team to the development team. Moreover, the more fine-grained the application, the harder it is to manage. [4]

Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices or monoliths. Alternatively, applications can be written to be purely serverless and use no provisioned servers at all. [5] This should not be confused with computing or networking models that do not require an actual server to function, such as peer-to-peer (P2P).

According to Yan Cui, serverless should be adopted only when it helps to deliver customer value faster. While adopting it, organizations should take small steps and de-risk along the way. [6]

Serverless runtimes

Serverless vendors offer compute runtimes that execute application logic but do not store data. Common runtime models are function as a service (FaaS) and container as a service. Common languages supported by serverless runtimes are Java, Python, and PHP. Generally, the functions run within isolation boundaries, such as Linux containers.
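
As a minimal illustration of the FaaS model, the sketch below shows a Python handler in the style used by platforms such as AWS Lambda; the event shape and the response format are assumptions chosen for an HTTP-triggered function, not the specification of any particular provider.

```python
import json

def handler(event, context):
    """Entry point invoked by the FaaS runtime for each incoming event.

    The platform supplies `event` (the trigger payload) and `context`
    (runtime metadata); the developer never starts or manages a server
    process, and the provider handles scaling and isolation.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```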

Commercial offerings

The first pay-as-you-go code execution platform was Zimki, released in 2006, but it was not commercially successful. [7] In 2008, Google released Google App Engine, which featured metered billing for applications that used a custom Python framework, but could not execute arbitrary code. [8] PiCloud, released in 2010, offered FaaS support for Python.

Google App Engine, introduced in 2008, was the first abstract serverless computing offering. [9] App Engine included HTTP functions with a 60-second timeout and a blob store and data store with their own timeouts. No in-memory persistence was allowed. All operations had to be executed within these limits, but this allowed apps built in App Engine to scale near-infinitely and was used to support early customers including Snapchat, as well as many external and internal Google apps. Language support was limited to Python using native Python modules, as well as a limited selection of Python modules in C that were chosen by Google. Like later serverless platforms, App Engine also used pay-for-what-you-use billing. [10]

AWS Lambda, introduced by Amazon in 2014, [11] popularized the abstract serverless computing model. It is supported by a number of additional AWS serverless tools, such as the AWS Serverless Application Model (AWS SAM), Amazon CloudWatch, and others.

Google Cloud Platform created a second serverless offering, Google Cloud Functions, in 2016. [12]

Oracle Cloud Functions is a serverless platform offered on Oracle Cloud Infrastructure, and is based on the open-source Fn Project so developers can create applications that can be ported to other cloud and on-premises environments. It supports code in Python, Go, Java, Ruby, and Node. [13]

Serverless databases

Several serverless databases have emerged to extend the serverless execution model to the RDBMS, eliminating the need to provision or scale virtualized or physical database hardware.

Nutanix offers a solution named Era which turns an existing RDBMS such as Oracle, MariaDB, PostgreSQL, or Microsoft SQL Server into a serverless service. [14]

Amazon Aurora offers a serverless version of its databases, based on MySQL and PostgreSQL, providing on-demand, auto-scaling configurations. [15]

Azure Data Lake is a highly scalable data storage and analytics service. The service is hosted in Azure, Microsoft's public cloud. Azure Data Lake Analytics provides a distributed infrastructure that can dynamically allocate or de-allocate resources so customers pay for only the services they use.

Oracle Cloud offers a serverless version of its Oracle Autonomous Database, which is the Autonomous Transaction Processing service. The serverless service also includes a JSON edition. [16]

Firebase, also owned by Google, [17] includes a hierarchical database and is available via fixed and pay-as-you-go plans. [18]

Advantages

Cost

Serverless can be more cost-effective than renting or purchasing a fixed quantity of servers, [19] which generally involves significant periods of underusage or idle time. [1] It can even be more cost-efficient than provisioning an autoscaling group, due to more efficient bin-packing of the underlying machine resources.

This can be described as pay-as-you-go computing [19] or bare-code, [19] as one is charged based solely upon the time and memory allocated to run one's code, without associated fees for idle time. [19] A useful analogy is renting a car (traditional cloud virtual machines) versus using a ride-share app such as Uber or Lyft (serverless computing). Immediate cost benefits are related to the lack of operating costs, including licenses, installation, dependencies, and personnel costs for maintenance, support, or patching. [19] Due to infinite scalability, developers may experience bill shock as a result of faulty code or a denial-of-service attack; however, such charges are often refunded at the service provider's expense. [20]
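
The following back-of-the-envelope sketch illustrates how such fine-grained billing is typically computed; the per-GB-second and per-request rates, the invocation count, and the memory size are all assumptions made up for the example, not any provider's actual price list.

```python
# Illustrative pay-per-use cost estimate for a FaaS workload.
# All rates and workload figures below are assumptions for the example.
PRICE_PER_GB_SECOND = 0.0000167        # assumed compute rate
PRICE_PER_MILLION_REQUESTS = 0.20      # assumed request rate

invocations = 3_000_000                # invocations per month
avg_duration_s = 0.120                 # average execution time per call
memory_gb = 0.512                      # memory allocated to the function

gb_seconds = invocations * avg_duration_s * memory_gb
compute_cost = gb_seconds * PRICE_PER_GB_SECOND
request_cost = (invocations / 1_000_000) * PRICE_PER_MILLION_REQUESTS

print(f"GB-seconds consumed: {gb_seconds:,.0f}")
print(f"Estimated monthly bill: ${compute_cost + request_cost:,.2f}")
# With zero invocations, the bill is zero: idle time is not charged.
```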

Elasticity versus scalability

In addition, a serverless architecture means that developers and operators do not need to spend time setting up and tuning autoscaling policies or systems; the cloud provider is responsible for scaling the capacity to the demand. [1] [21] [19] As Google puts it: "from prototype to production to planet-scale." [19]

As cloud native systems inherently scale down as well as up, these systems are known as elastic rather than scalable.

Small teams of developers are able to run code themselves without depending upon teams of infrastructure and support engineers; more developers are becoming DevOps-skilled, and the distinctions between being a software developer and a hardware engineer are blurring. [19]

Productivity

With function as a service, the units of code exposed to the outside world are simple event-driven functions. This means that typically, the programmer does not have to worry about multithreading or directly handling HTTP requests in their code, simplifying the task of back-end software development.
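
For instance, a function triggered by an object-storage or queue event receives the event payload directly and processes it sequentially; the record layout below is a made-up assumption, since each provider defines its own payload format.

```python
def handle_upload(event, context):
    """Hypothetical handler reacting to object-storage upload events.

    The platform dispatches one invocation per event (scaling out by
    running more copies of the function), so the body stays single-
    threaded and contains no HTTP-server or thread-pool code.
    """
    for record in event.get("records", []):   # assumed payload layout
        bucket, key = record["bucket"], record["key"]
        print(f"New object uploaded: {bucket}/{key}")
```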

Disadvantages

Serverless applications are prone to the fallacies of distributed computing. In addition, they are prone to the following fallacies: [22] [23]

Performance

Infrequently-used serverless code may suffer from greater response latency than code that is continuously running on a dedicated server, virtual machine, or container. This is because, unlike with autoscaling, the cloud provider typically spins down the serverless code completely when not in use. This means that if the runtime (for example, the Java runtime) requires a significant amount of time to start up, it will create additional latency. [24] This is referred to as cold start in serverless computing.
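
A common way to limit the impact of cold starts, sketched below, is to perform expensive initialization at module scope so that it runs once per cold start and is reused by warm invocations of the same execution environment; the configuration dictionary stands in for a hypothetical expensive setup step.

```python
import time

# Module-scope work runs once per cold start of the execution environment
# and is reused by every warm invocation served by the same instance.
_STARTED_AT = time.time()
_CONFIG = {"table": "orders", "region": "eu-west-1"}  # stand-in for an
                                                      # expensive setup step

def handler(event, context):
    # On a warm invocation only the per-request work below adds latency;
    # the initialization cost above was paid during the cold start.
    instance_age = time.time() - _STARTED_AT
    return {"instance_age_seconds": round(instance_age, 2),
            "table": _CONFIG["table"]}
```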

Resource limits

Serverless computing is not suited to some computing workloads, such as high-performance computing, because of the resource limits imposed by cloud providers, and also because it would likely be cheaper to bulk-provision the number of servers believed to be required at any given point in time. [25] This makes it challenging to deploy complex applications (such as those with a directed acyclic graph of functions); serverless computing out of the box is most suited for execution of individual stateless functions. Some commercial offerings like AWS Step Functions from Amazon and Azure Durable Functions from Microsoft are meant to ease this challenge.
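
As an illustration of how such orchestration services express a multi-step workflow over otherwise stateless functions, the sketch below encodes a two-step sequence in the style of the Amazon States Language, written here as a Python dictionary; the state names and function ARNs are placeholders.

```python
# A minimal two-step workflow in the style of the Amazon States Language,
# expressed as a Python dict. State names and resource ARNs are placeholders.
workflow_definition = {
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:region:account:function:validate",
            "Next": "ChargePayment",
        },
        "ChargePayment": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:region:account:function:charge",
            "End": True,
        },
    },
}
# An orchestrator such as AWS Step Functions keeps the workflow state
# between function invocations and can retry individual failed steps.
```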

Monitoring and debugging

Diagnosing performance or excessive resource usage problems with serverless code may be more difficult than with traditional server code, because although entire functions can be timed, [5] there is typically no ability to dig into more detail by attaching profilers, debuggers, or APM tools. [26] Furthermore, the environment in which the code runs is typically not open source, so its performance characteristics cannot be precisely replicated in a local environment.

Security

According to OWASP, serverless applications are vulnerable to variations of traditional attacks, insecure code, and some serverless-specific attacks (like Denial of Wallet [27] ). So, the risks have changed and attack prevention requires a shift in mindset. [28] [29]

Serverless is sometimes mistakenly considered more secure than traditional architectures. While this is true to some extent because OS vulnerabilities are taken care of by the cloud provider, the total attack surface is significantly larger, as there are many more components in the application compared to traditional architectures, and each component is an entry point to the serverless application. Moreover, the security solutions customers previously used to protect their cloud workloads become irrelevant, as customers cannot control or install anything at the endpoint and network level, such as an intrusion detection/prevention system (IDS/IPS). [30]

This is intensified by the mono-culture properties of the entire server network. (A single flaw can be applied globally.) According to Protego, the "solution to secure serverless apps is close partnership between developers, DevOps, and AppSec, also known as DevSecOps. Find the balance where developers don't own security, but they aren't absolved from responsibility either. Take steps to make it everyone's problem. Create cross-functional teams and work towards tight integration between security specialists and development teams. Collaborate so your organization can resolve security risks at the speed of serverless." [31]

Privacy

Many serverless function environments are based on proprietary public cloud environments. Here, some privacy implications have to be considered, such as shared resources and access by external employees. However, serverless computing can also be done in a private cloud environment or even on-premises, using, for example, the Kubernetes platform. This gives companies full control over privacy mechanisms, just as with hosting in traditional server setups.

Standards

Serverless computing is covered by the International Data Center Authority (IDCA) in their Framework AE360. [32] However, portability can be an issue when moving business logic from one public cloud to another, an issue the Docker solution was created to address. The Cloud Native Computing Foundation (CNCF) is also working on developing a specification with Oracle. [33]

Vendor lock-in

Serverless computing is provided as a third-party service. Applications and software that run in the serverless environment are by default locked to a specific cloud vendor. This issue is exacerbated in serverless computing because, with its increased level of abstraction, public vendors only allow customers to upload code to a FaaS platform without the ability to configure the underlying environments. More importantly, when considering a more complex workflow that includes Backend-as-a-Service (BaaS), a BaaS offering can typically only natively trigger a FaaS offering from the same provider. This makes workload migration in serverless computing virtually impossible. Therefore, considering how to design and deploy serverless workflows from a multi-cloud perspective seems promising and is starting to prevail. [34] [35] [36]

Best practices

Following DevSecOps practices can help one to use and secure serverless technologies more effectively. [37] In serverless applications, the line between infrastructure and business logic is blurred, and the apps are usually spread across various services. According to Yan Cui, to get the most value from testing efforts, serverless applications should be tested mainly for their integrations, and unit tests should arguably be used only if there is complex business logic. Also, to make debugging and implementation of serverless applications easier, developers should use orchestration within the bounded context of a microservice and choreography between bounded contexts. [6]
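
A minimal sketch of this integration-first testing style, assuming a `handler` function like the earlier examples lives in a hypothetical `app` module and that the test is run with a tool such as pytest; the sample event and expected response are assumptions for illustration.

```python
import json

from app import handler  # hypothetical module containing the function

def test_handler_returns_greeting():
    # Exercise the function through its real entry point, as the platform
    # would, rather than unit-testing internal helpers in isolation.
    event = {"queryStringParameters": {"name": "Ada"}}
    response = handler(event, context=None)
    assert response["statusCode"] == 200
    assert json.loads(response["body"])["message"] == "Hello, Ada!"
```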

According to Yan Cui, ephemeral resources should be kept together to achieve high cohesion. However, shared resources that have a long spin-up time (e.g., an AWS RDS cluster) and landing zones should have their own separate repository, deployment pipeline, and stack. [6]

Uses/functions

Serverless functions can be used for: [38]

Related Research Articles

In computing, a solution stack or software stack is a set of software subsystems or components needed to create a complete platform such that no additional software is needed to support applications. Applications are said to "run on" or "run on top of" the resulting platform.

Google App Engine is a cloud computing platform used as a service for developing and hosting web applications. Applications are sandboxed and run across multiple Google-managed servers. GAE supports automatic scaling for web applications, allocating more resources to the web application as the number of requests increases. It was released as a preview in April 2008 and launched officially in September 2011.

Platform as a service (PaaS) or application platform as a service (aPaaS) or platform-based service is a cloud computing service model where users provision, instantiate, run and manage a modular bundle of a computing platform and applications, without the complexity of building and maintaining the infrastructure associated with developing and launching application(s), and to allow developers to create, develop, and package such software bundles.

Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. Large clouds often have functions distributed over multiple locations, each of which is a data center. Cloud computing relies on sharing of resources to achieve coherence and typically uses a pay-as-you-go model, which can help in reducing capital expenses but may also lead to unexpected operating expenses for users.

Eucalyptus is paid and open-source computer software for building Amazon Web Services (AWS)-compatible private and hybrid cloud computing environments, originally developed by the company Eucalyptus Systems. Eucalyptus is an acronym for Elastic Utility Computing Architecture for Linking Your Programs To Useful Systems. Eucalyptus enables pooling compute, storage, and network resources that can be dynamically scaled up or down as application workloads change. Mårten Mickos was the CEO of Eucalyptus. In September 2014, Eucalyptus was acquired by Hewlett-Packard and then maintained by DXC Technology. After DXC stopped developing the product in late 2017, AppScale Systems forked the code and started supporting Eucalyptus customers.

A virtual private cloud (VPC) is an on-demand configurable pool of shared resources allocated within a public cloud environment, providing a certain level of isolation between the different organizations using the resources. The isolation between one VPC user and all other users of the same cloud is achieved normally through allocation of a private IP subnet and a virtual communication construct per user. In a VPC, the previously described mechanism, providing isolation within the cloud, is accompanied with a virtual private network (VPN) function that secures, by means of authentication and encryption, the remote access of the organization to its VPC resources. With the introduction of the described isolation levels, an organization using this service is in effect working on a 'virtually private' cloud, and hence the name VPC.

AppScale is a software company that offers cloud infrastructure software and services to enterprises, government agencies, contractors, and third-party service providers. The company commercially supports one software product, AppScale ATS, a managed hybrid cloud infrastructure software platform that emulates the core AWS APIs. In 2019, the company ended commercial support for its open-source serverless computing platform AppScale GTS, but AppScale GTS source code remains freely available to the open-source community.

A cloud database is a database that typically runs on a cloud computing platform and access to the database is provided as-a-service. There are two common deployment models: users can run databases on the cloud independently, using a virtual machine image, or they can purchase access to a database service, maintained by a cloud database provider. Of the databases available on the cloud, some are SQL-based and some use a NoSQL data model.

OpenShift is a family of containerization software products developed by Red Hat. Its flagship product is the OpenShift Container Platform, a hybrid cloud platform as a service built around Linux containers orchestrated and managed by Kubernetes on a foundation of Red Hat Enterprise Linux. The family's other products provide this platform through different environments: OKD serves as the community-driven upstream; several deployment methods are available, including self-managed and cloud-native offerings (ROSA, ARO, and RHOIC on AWS, Azure, and IBM Cloud, respectively); OpenShift Online is offered as software as a service; and OpenShift Dedicated as a managed service.

Backend as a service (BaaS), sometimes also referred to as mobile backend as a service (MBaaS), is a service for providing web app and mobile app developers with a way to easily build a backend to their frontend applications. Features available include user management, push notifications, and integration with social networking services. These services are provided via the use of custom software development kits (SDKs) and application programming interfaces (APIs). BaaS is a relatively recent development in cloud computing, with most BaaS startups dating from 2011 or later. Some of the most popular service providers are AWS Amplify and Firebase.

Google Cloud Platform (GCP) is a suite of cloud computing services offered by Google that provides a series of modular cloud services including computing, data storage, data analytics, and machine learning, alongside a set of management tools. It runs on the same infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail, and Google Docs, according to Verma et al. Registration requires a credit card or bank account details.

Autoscaling, also spelled auto scaling or auto-scaling, and sometimes also called automatic scaling, is a method used in cloud computing that dynamically adjusts the amount of computational resources in a server farm - typically measured by the number of active servers - automatically based on the load on the farm. For example, the number of servers running behind a web application may be increased or decreased automatically based on the number of active users on the site. Since such metrics may change dramatically throughout the course of the day, and servers are a limited resource that cost money to run even while idle, there is often an incentive to run "just enough" servers to support the current load while still being able to support sudden and large spikes in activity. Autoscaling is helpful for such needs, as it can reduce the number of active servers when activity is low, and launch new servers when activity is high. Autoscaling is closely related to, and builds upon, the idea of load balancing.

AWS Lambda is an event-driven, serverless Function as a Service (FaaS) provided by Amazon as a part of Amazon Web Services. It is designed to enable developers to run code without provisioning or managing servers. It executes code in response to events and automatically manages the computing resources required by that code. It was introduced on November 13, 2014.

The Serverless Framework is a web framework written using Node.js. Serverless is the first framework developed for building applications on AWS Lambda, a serverless computing platform provided by Amazon as a part of Amazon Web Services. Currently, applications developed with Serverless can be deployed to other function as a service providers, including Microsoft Azure with Azure Functions, IBM Bluemix with IBM Cloud Functions based on Apache OpenWhisk, Google Cloud using Google Cloud Functions, Oracle Cloud using Oracle Fn, Kubeless based on Kubernetes, Spotinst and Webtask by Auth0.

Function as a service (FaaS) is a category of cloud computing services that provides a platform allowing customers to develop, run, and manage application functionalities without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app. Building an application following this model is one way of achieving a "serverless" architecture, and is typically used when building microservices applications.

Apache MXNet is an open-source deep learning software framework that trains and deploys deep neural networks. It aims to be scalable, allows fast model training, and supports a flexible programming model and multiple programming languages. The MXNet library is portable and can scale to multiple GPUs and machines. It was co-developed by Carlos Guestrin at the University of Washington, along with GraphLab.

Netlify is a remote-first cloud computing company that offers a development platform that includes build, deploy, and serverless backend services for web applications and dynamic websites.

IBM Cloud is a set of cloud computing services for business offered by the information technology company IBM.

AWS Glue is an event-driven, serverless computing platform provided by Amazon as a part of Amazon Web Services. It was introduced in August 2017.

AWS App Runner is a fully managed container application service offered by Amazon Web Services (AWS). Launched in May 2021, it is designed to simplify the process of building, deploying, and scaling containerized applications for developers. The service enables users to focus on writing code and developing features, without needing to manage the underlying infrastructure. It provides automatic scaling, load balancing, and security features, making it a suitable choice for deploying web applications and APIs. The service also simplifies MLOps.

References

  1. Miller, Ron (24 Nov 2015). "AWS Lambda Makes Serverless Applications A Reality". TechCrunch . Retrieved 10 July 2016.
  2. Kounev, Samuel (23 August 2023). "Serverless Computing: What It Is, and What It Is Not?". Communications of the ACM . Retrieved 4 April 2024.
  3. Serverless as a Game Changer: How to Get the Most Out of the Cloud. 2023. ISBN   9780137392551.
  4. The Software Architect Elevator: Redefining the Architect's Role in the Digital Enterprise. O'Reilly Media. 2020. ISBN   978-1492077541.
  5. MSV, Janakiram (16 July 2015). "PaaS Vendors, Watch Out! Amazon Is All Set To Disrupt the Market". Forbes . Retrieved 10 July 2016.
  6. Cui, Yan (2020). Serverless Architectures on AWS (2nd ed.). Manning. ISBN   978-1617295423.
  7. Williams, Christopher. "Fotango to smother Zimki on Christmas Eve". The Register . Retrieved 2017-06-11.
  8. "Python Runtime Environment | App Engine standard environment for Python | Google Cloud Platform". Google Cloud Platform. Retrieved 2017-06-11.
  9. Evans, Jon (11 April 2015). "Whatever Happened to PaaS?". TechCrunch. Retrieved 17 December 2020.
  10. Kincaid, Jason (25 February 2009). "Google App Engine Offers Pricing Plan Beyond Quotas; Grab A Free I/O Ticket To Celebrate". TechCrunch. Retrieved 17 December 2020.
  11. Miller, Ron (13 Nov 2014). "Amazon Launches Lambda, An Event-Driven Compute Service". TechCrunch . Retrieved 10 July 2016.
  12. Novet, Jordan (9 February 2016). "Google has quietly launched its answer to AWS Lambda". VentureBeat . Retrieved 10 July 2016.
  13. "How to choose a cloud serverless platform". www.arnnet.com.au. 3 March 2021. Retrieved 2022-03-23.
  14. "One-click Database Administration & Automation | Nutanix Era".
  15. "Amazon Aurora Serverless - On-demand, Auto-scaling Relational Database - AWS". Amazon Web Services, Inc. Retrieved 2019-08-08.
  16. "Oracle brings the Autonomous Database to JSON". ZDNet. Retrieved 2022-03-23.
  17. Lardinois, Frederic (21 October 2014). "Google Acquires Firebase To Help Developers Build Better Real-Time Apps | TechCrunch" . Retrieved 2017-06-11.
  18. Darrow, Barb (2013-06-20). "Firebase gets $5.6M to launch its paid product and fire up its base". gigaom.com. Retrieved 2017-06-11.
  19. Jamieson, Frazer (4 September 2017). "Losing the server? Everybody is talking about serverless architecture".
  20. Anderson, Tim (2020-12-10). "Google Cloud (over)Run: How a free trial experiment ended with a $72,000 bill overnight". The Register. Retrieved 2024-10-09.
  21. Miller, Ron (31 March 2016). "Microsoft answers AWS Lambda's event-triggered serverless apps with Azure Functions". TechCrunch . Retrieved 10 July 2016.
  22. Richards, Mark (March 3, 2020). Fundamentals of Software Architecture: An Engineering Approach (1st ed.). O'Reilly Media. ISBN   978-1492043454.
  23. Richards, Mark (2021). Software Architecture: The Hard Parts: Modern Trade-Off Analyses for Distributed Architectures (1st ed.). O'Reilly Media. ISBN   978-1492086895.
  24. van Eyk, Erwin; Iosup, Alexandru; Abad, Cristina L.; Grohmann, Johannes; Eismann, Simon (2018). "A SPEC RG Cloud Group's Vision on the Performance Challenges of FaaS Cloud Architectures" (PDF). Companion of the 2018 ACM/SPEC International Conference on Performance Engineering. pp. 21–24. doi:10.1145/3185768.3186308. hdl: 1871.1/8aa529e9-f8f9-4305-8073-91dd1a9451fb . ISBN   9781450356299. S2CID   4718290.
  25. Hellerstein, Joseph; Faleiro, Jose; Gonzalez, Joseph; Schleier-Smith, Johann; Sreekanti, Vikram; Tumanov, Alexey; Wu, Chenggang (2019). "Serverless Computing: One Step Forward, Two Steps Back". arXiv: 1812.03651 .
  26. Leitner, Philipp; Wittern, Erik; Spillner, Josef; Hummer, Waldemar (2019). "A mixed-method empirical study of Function-as-a-Service software development in industrial practice". Journal of Systems and Software. 149: 340–359. doi:10.1016/j.jss.2018.12.013. hdl: 11475/14313 . ISSN   0164-1212. S2CID   67775784.
  27. Kelly, Daniel; Glavin, Frank G.; Barrett, Enda (2021-08-01). "Denial of wallet—Defining a looming threat to serverless computing". Journal of Information Security and Applications. 60: 102843. doi:10.1016/j.jisa.2021.102843. ISSN   2214-2126.
  28. "OWASP Serverless Top 10 | OWASP Foundation". owasp.org. Retrieved 2024-05-20.
  29. OWASP/Serverless-Top-10-Project, OWASP, 2024-05-02, retrieved 2024-05-20
  30. "Cloud Workload Protection (CWP) | CWPP".
  31. Solow, Hillel (2019-02-05). "Serverless Computing Security Risks & Challenges". protego.io. Retrieved 2019-03-20.
  32. "The Standards Framework for the Application Ecosystem | International Data Center Authority (IDCA)".
  33. "CNCF, Oracle Boost Serverless Standardization Efforts". SDxCentral. Retrieved 2018-11-24.
  34. Aske, Austin; Zhao, Xinghui (2018-08-13). "Supporting Multi-Provider Serverless Computing on the Edge". Proceedings of the 47th International Conference on Parallel Processing Companion. ICPP Workshops '18. New York, NY, USA: Association for Computing Machinery. pp. 1–6. doi:10.1145/3229710.3229742. ISBN   978-1-4503-6523-9. S2CID   195348799.
  35. Baarzi, Ataollah Fatahi; Kesidis, George; Joe-Wong, Carlee; Shahrad, Mohammad (2021-11-01). "On Merits and Viability of Multi-Cloud Serverless". Proceedings of the ACM Symposium on Cloud Computing. SoCC '21. New York, NY, USA: Association for Computing Machinery. pp. 600–608. doi:10.1145/3472883.3487002. ISBN   978-1-4503-8638-8. S2CID   239890130.
  36. Zhao, Haidong; Benomar, Zakaria; Pfandzelter, Tobias; Georgantas, Nikolaos (2022-12-06). "Supporting Multi-Cloud in Serverless Computing". 2022 IEEE/ACM 15th International Conference on Utility and Cloud Computing (UCC). pp. 285–290. arXiv: 2209.09367 . doi:10.1109/UCC56403.2022.00051. ISBN   978-1-6654-6087-3. S2CID   252383217.
  37. Katzer, Jason (2020). Learning Serverless: Design, Develop, and Deploy with Confidence. O'Reilly Media. ISBN   978-1492057017.
  38. "What Is Serverless Computing?". ITPro Today. 2021-12-13. Retrieved 2022-03-23.

Further reading

  1. Jonas, Eric (February 2019). "Cloud Programming Simplified: A Berkeley View on Serverless Computing". pp. 1–33. arXiv: 1902.03383 [cs.OS].