To meet user needs in different scenarios, Function Compute provides four types of functions: event functions, web functions, task functions, and GPU functions. This topic describes the applicable scenarios of and differences between these function types to help you choose the right one.
Overview of selection
When using Function Compute, you can choose the appropriate function type and runtime environment based on your business scenario and technology stack preference.
For web applications and API services, you can use web functions combined with Custom Runtime. Web functions support various popular web application frameworks and can be accessed through a browser or invoked directly through a URL.
For scenarios such as file processing and data stream processing, it is recommended to use event functions combined with Built-in Runtime. You can configure event triggers and integrate various Alibaba Cloud products such as Object Storage Service, ApsaraMQ for RocketMQ, and Simple Log Service.
For model inference scenarios such as chatbots and text-to-image, you can use GPU functions combined with Custom Images. Based on container images of popular AI projects such as ComfyUI, RAG, and TensorRT, you can quickly build AI model inference services.
For asynchronous task scenarios such as scheduled tasks and audio and video transcoding, it is recommended to use task functions combined with Built-in Runtime.
For detailed information about function types and runtime environments, see the table below.
Both Built-in Runtime and Custom Runtime are deployed to functions as code packages. If you need containerized deployment, you can instead choose Custom Images as the runtime environment.
GPU functions support only Custom Images as the runtime environment.
Analysis of selection
Function type selection
| Comparison item | Event Function | Web Function | Task Function | GPU Function |
| --- | --- | --- | --- | --- |
| Feature | Used to process files and data streams. Can be triggered by events from various cloud products through triggers such as OSS triggers, Kafka triggers, and SLS triggers. | Supports popular web application frameworks. Can be accessed through a browser or invoked through a URL. | Used to process asynchronous requests. Can track and save the state of an asynchronous invocation in each phase. | Supports container images of popular AI projects such as Stable Diffusion WebUI, ComfyUI, RAG, and TensorRT, so that you can quickly build AI model inference services. |
| Applicable scenarios | File processing and data stream processing. | Web applications and API services. | Asynchronous tasks such as scheduled tasks and audio and video transcoding. | Model inference such as chatbots and text-to-image. |
| Runtime environment | Built-in runtime is recommended. | Custom runtime is recommended. | Built-in runtime is recommended. | Only custom images are supported. |
| Asynchronous tasks | Disabled by default. | Disabled by default. | Enabled by default. | Disabled by default. |
If you need to enable asynchronous tasks for an existing function, you can follow the steps in Manage tasks.
Function runtime environment selection
| Comparison item | Built-in runtime | Custom runtime | Custom image |
| --- | --- | --- | --- |
| Development workflow | Write handlers based on the interfaces defined by Function Compute. | Develop web applications based on framework templates and observe the results in real time through a public endpoint. | Upload custom images to Alibaba Cloud Container Registry, or use images that are already available in Container Registry. |
| Supported instance types | CPU instances | CPU instances | CPU instances and GPU-accelerated instances |
|  | Not supported | Supported | Supported |
| Cold start latency | Shortest. The runtime is not included in the code package, so cold starts are the shortest. | Short. The code package is an HTTP server, which is relatively large, but no image needs to be pulled, so cold starts are short. | Relatively long. Images must be pulled, so cold starts are relatively long. |
| Code deliverable format | ZIP, JAR (Java), and folder | ZIP, JAR (Java), and folder | Container image |
| Size limit | A maximum of 500 MB in some regions (such as Hangzhou) and 100 MB in other regions. Note: You can configure layers to add dependencies and reduce the size of the code package. | A maximum of 500 MB in some regions (such as Hangzhou) and 100 MB in other regions. Note: You can configure layers to add dependencies and reduce the size of the code package. | Note: For AI inference applications, you can store large models in NAS or OSS to reduce the image size. |
| Supported programming languages | Node.js, Python, PHP, Java, C#, Go | No limits | No limits |
Create functions through the console
Event function
If you want functions to be invoked through event triggers such as OSS triggers, Kafka triggers, and SLS triggers, it is recommended to create event functions and use the built-in runtime as the Runtime Environment.
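As a rough illustration, a handler for the built-in Python runtime might look like the following sketch. The handler name (for example, index.handler) is configured on the function; the field names shown for the OSS trigger payload follow the commonly documented event format and should be verified against the trigger you actually configure.

```python
# -*- coding: utf-8 -*-
# Minimal sketch of an event function handler for the built-in Python runtime.
# The OSS trigger delivers its payload to the handler as a JSON byte string.
import json

def handler(event, context):
    evt = json.loads(event)                       # parse the trigger payload
    for record in evt.get("events", []):          # OSS events arrive under "events"
        bucket = record["oss"]["bucket"]["name"]
        key = record["oss"]["object"]["key"]
        print(f"received object {key} from bucket {bucket}")
    return "ok"
```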
Web function
If you want to write programs based on popular frameworks in various languages, such as Java Spring Boot, Node.js Express, Python Flask, and Golang Gin, it is recommended to create web functions and use the custom runtime as the Runtime Environment.
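As a minimal sketch, a Flask-based web function for the custom runtime only needs to start an HTTP server that listens on the port configured for the function (commonly 9000 by default); the route and response below are illustrative.

```python
# Minimal sketch of a web function for the custom runtime: any HTTP server
# works as long as it listens on the port that Function Compute routes
# requests to (assumed here to be 9000, the common default).
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["GET", "POST"])
def index():
    name = request.args.get("name", "world")
    return f"Hello, {name}!"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=9000)
```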
Task function
If you want to initiate asynchronous invocations for functions and need to track and save the states of each phase of the asynchronous invocation, it is recommended to create task functions and use the built-in runtime as the Runtime Environment.
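A task function uses the same handler interface as an event function. The sketch below assumes a hypothetical payload with a video_id field to stand in for a transcoding job and is only meant to show the shape of the handler.

```python
# Minimal sketch of a task function handler (built-in Python runtime).
# The function is invoked asynchronously; Function Compute records the state
# of each task so that you can query or stop it later.
import json
import time

def handler(event, context):
    task = json.loads(event)                      # payload passed on the async invocation
    video_id = task.get("video_id", "unknown")    # hypothetical field for illustration
    print(f"start transcoding {video_id}, request id {context.request_id}")
    time.sleep(5)                                 # stand-in for the long-running work
    return json.dumps({"video_id": video_id, "status": "done"})
```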
GPU function
If you want to create GPU instances using container images of popular AI projects such as Stable Diffusion WebUI, ComfyUI, RAG, and TensorRT, it is recommended to create GPU functions. GPU functions support only the Custom Image runtime environment.
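The sketch below outlines the kind of HTTP inference entrypoint you might package into such a custom image. The /invoke path, the NAS model path, and the placeholder inference step are illustrative assumptions; the server simply needs to listen on the port configured for the function (commonly 9000 by default).

```python
# Minimal sketch of an inference entrypoint packaged into a custom image for a
# GPU function. Large model weights are assumed to live on a NAS mount rather
# than inside the image, as suggested in the size-limit note above.
from flask import Flask, request, jsonify

app = Flask(__name__)

MODEL_PATH = "/mnt/nas/models/sd-v1-5"   # hypothetical NAS mount point
model = None

def load_model():
    # Stand-in for loading weights onto the GPU once per instance (cold start).
    global model
    if model is None:
        model = f"loaded:{MODEL_PATH}"
    return model

@app.route("/invoke", methods=["POST"])  # hypothetical inference endpoint
def invoke():
    load_model()
    prompt = request.get_json(force=True).get("prompt", "")
    # Stand-in for running text-to-image or LLM inference on the GPU.
    return jsonify({"prompt": prompt, "image": "<base64 image placeholder>"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=9000)
```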