Getting Started with Cloud Run HTTP Functions: A Complete Walkthrough


Inception
Hello everyone! This article is part of The GCP Series, and it does not depend on any previous articles. I use this series to publish GCP + Terraform projects and knowledge.
Overview
Cloud Run functions has a simple and intuitive developer experience. Just write your code and let Google Cloud handle the operational infrastructure. Develop faster by writing and running small code snippets that respond to events. Streamline challenging orchestration problems by connecting Google Cloud products to one another or third-party services using events.[1]
Cloud Run functions are listed under the Cloud Run service. Like the serverless compute offerings of other cloud providers, Cloud Run functions let you run your code in a serverless model and pay only for the time your function is invoked, so you avoid managing the underlying compute layers and can focus on your code's purpose and results.
In today’s blog post, we will dig into the essential Cloud Run function concepts, build a Cloud Run function with a Python HTTP function using Terraform from scratch, and see how to manage all of the function's parts.
Cloud Run concept essentials
Deployment options
Cloud Run provides multiple deployment options. All of them result in a container image stored in the Google Artifact Registry/Container Registry service and run as a service or a job, and all of them run on highly scalable infrastructure.
Functions are deployed as Cloud Run services.[2]
Continuous source deployment from git
Cloud Run helps you configure continuous deployment from Git. Like source deployments, you can deploy sources that include a Dockerfile or are written in one of the supported language runtimes. Continuous deployment from Git is available for Cloud Run services and can be manually configured in Cloud Build for Cloud Run jobs.[3]
Cloud Run Service
The service is one of the main resources of Cloud Run. Each service is located in a specific Google Cloud region. For redundancy and failover, services are automatically replicated across multiple zones in the region they are in. A given Google Cloud project can run many services in different regions.
Each service exposes a unique endpoint, and by default, automatically scales the underlying infrastructure to handle incoming requests, though you can optionally change the scaling behavior to manual scaling if needed. You can deploy a service from a container, repository, or source code.[4]
The following diagram shows the Cloud Run resource model for services:
In the diagram, Service A is receiving many requests, which results in several instances starting up and running, each running a single container. Service B is not receiving requests, so no instance has started yet. Service C is running multiple containers per instance within each revision; note that only the ingress container receives requests, and every instance with multiple containers scales as an independent unit.
Cloud Run service revisions
Each deployment to a service creates a revision. A revision consists of one or more container images, along with configuration such as environment variables, memory limits, or request concurrency value.
Revisions are immutable: once a revision has been created, it cannot be modified.
Requests are automatically routed as soon as possible to the latest healthy service revision.[5]
Cloud Run Jobs
Each job is located in a specific Google Cloud region and executes one or more containers to completion. A job consists of one or multiple independent tasks that are executed in parallel in a given job execution.[5]
When a job is executed, a job execution is created in which all job tasks are started. All tasks in a job execution must complete successfully for the job execution to be successful. You can set timeouts on tasks and specify the number of retries in case of task failure. If any task exceeds its maximum number of retries, that task is marked as failed and the job is marked as failed. By default, tasks execute in parallel up to a maximum of 100, but you can specify a lower maximum if any of your backing resources require it.[5]
Every job execution executes a number of tasks in parallel. Each task runs one instance, and might retry it.[5]
The primary source for Cloud Run jobs is container images stored in Container Registry/Artifact Registry. Cloud Run jobs are typically aimed at short-lived work, for example:
Running batch data processing.
Processing large amounts of customer data nightly and generating reports.
Updating a database schema at a specific time.
Contrast between Cloud Run Service & Jobs
Think of Cloud Run Services (like Functions) as always-on or on-demand listeners, ready to handle individual requests. They are designed for event-driven or request/response workloads.
Cloud Run Jobs, on the other hand, are designed for finite, background processes that run to completion. They are ideal for tasks that don't require constantly listening for requests.
Think of it this way: Cloud Run Services are often about deploying continuously running applications or on-demand functions, so having the flexibility to build from source directly is very useful. Cloud Run Jobs, on the other hand, are focused on executing specific, often short-lived, tasks where a pre-built container image containing the task logic is the more straightforward and consistent approach.
The following table contrasts Cloud Run services and Cloud Run jobs, highlighting their key differences:
| Feature | Cloud Run Service | Cloud Run Job |
| --- | --- | --- |
| Primary Purpose | Handling continuous requests, event-driven workloads, always-on applications. | Executing finite, background tasks and batch processes to completion. |
| Execution Model | Runs continuously or on-demand, scales based on incoming traffic. | Executes a set of tasks in parallel within a job execution, then terminates. |
| Lifecycle | Long-running or on-demand; persists to serve requests. | Finite execution; runs to completion (success or failure). |
| Triggering | HTTP requests, Pub/Sub events, Cloud Storage events, direct invocation, always on. | Manual execution, Cloud Scheduler, programmatically. |
| Scaling | Autoscales the number of container instances based on concurrency. | Scales the number of parallel tasks within a job execution. |
| Use Cases | Web applications, APIs, event processors, microservices. | Batch data processing, database migrations, scheduled tasks, media processing, ML training (short-lived). |
| Source of Code | Container image, source code (automatic containerization), connected repositories. | Primarily container image. |
| Cost Model | Billed based on compute time used (CPU, memory, requests, network). | Billed based on compute time used by the job execution (CPU, memory). |
| Retries | Configurable retries for request handling within a service instance. | Configurable retries for individual task failures within a job execution. |
| Parallelism | Handles concurrent requests within service instances. | Executes multiple tasks in parallel within a job execution (configurable maximum). |
| State Management | Stateless by design; state should be handled externally (e.g., databases, caches). | Tasks are typically stateless; any persistent state needs external management. |
| Timeouts | Configurable request timeouts. | Configurable timeouts for individual tasks and the entire job execution. |
This table should give you a clear side-by-side comparison of the key characteristics of Cloud Run Services and Cloud Run Jobs.
HTTP Cloud Run Functions overview
In Cloud Run functions, you write an HTTP function when you want to invoke a function through an HTTP(S) request. To allow for HTTP semantics, you use the Functions Framework and specify an HTTP function signature that accepts HTTP-specific arguments.
The following example shows a basic HTTP function source file for the Python runtime. See Source directory structure for information about where to locate your source code.
import functions_framework

# Register an HTTP function with the Functions Framework
@functions_framework.http
def my_http_function(request):
    # Your code here

    # Return an HTTP response
    return 'OK'
In Python, you register an HTTP handler function with the Functions Framework for Python. Your HTTP handler function must accept a Flask request object as an argument and return a value that Flask can convert into an HTTP response object.
The function entry point is the name with which the handler is registered with the Functions Framework. In this example, the entry point is my_http_function.[6]
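As an illustration of "a value that Flask can convert into an HTTP response object", the handler does not have to return a plain string. Here is a minimal sketch (reusing the my_http_function name from the example above) that returns a body, status code, and headers as a tuple, which Flask also accepts:
import functions_framework

@functions_framework.http
def my_http_function(request):
    # Flask converts a (body, status, headers) tuple into a full HTTP response.
    return '{"message": "OK"}', 200, {"Content-Type": "application/json"}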
What is functions_framework
An open source FaaS (Function as a service) framework for writing portable Python functions -- brought to you by the Google Cloud Functions team.
The Functions Framework lets you write lightweight functions that run in many different environments, including: [7]
Your local development machine
Knative-based environments.
Features
Spin up a local development server for quick testing
Invoke a function in response to a request
Automatically unmarshal events conforming to the CloudEvents spec
Portable between serverless platforms
functions_framework Installation
pip install functions_framework
Or, for deployment with a requirements.txt file:
functions-framework==3.*
functions_framework object attributes discovery
List the functions_framework object attributes:
>>> import functions_framework
>>> import json
>>> object_attributes = dir(functions_framework)
>>> print(json.dumps(object_attributes, indent=4, default=str)) # Print in a human-readable form
[
"BackgroundEvent",
"Callable",
"CloudEvent",
"CloudEventFunction",
"Context",
"DummyErrorHandler",
"EventConversionException",
"FunctionsFrameworkException",
"HTTPFunction",
"LazyWSGIApp",
"MissingSourceException",
"Type",
"_CLOUDEVENT_MIME_TYPE",
"_CRASH",
"_FUNCTION_STATUS_HEADER_FIELD",
"_LoggingHandler",
"__builtins__",
"__cached__",
"__doc__",
"__file__",
"__loader__",
"__name__",
"__package__",
"__path__",
"__spec__",
"_cloud_event_view_func_wrapper",
"_configure_app",
"_configure_app_execution_id_logging",
"_enable_execution_id_logging",
"_event_view_func_wrapper",
"_function_registry",
"_http_view_func_wrapper",
"_run_cloud_event",
"_typed_event",
"_typed_event_func_wrapper",
"app",
"background_event",
"cloud_event",
"cloud_exceptions",
"crash_handler",
"create_app",
"errorhandler",
"event_conversion",
"exceptions",
"execution_id",
"flask",
"from_http",
"functools",
"http", # HERE WE ARE
"inspect",
"io",
"is_binary",
"json",
"logging",
"os",
"pathlib",
"read_request",
"setup_logging",
"signature",
"sys",
"typed",
"types",
"werkzeug"
]
functions_framework http function
Now let’s discover how to use the http function. Create a main.py file with the content below.
import flask
import flask.typing
import functions_framework
@functions_framework.http
def hello(request: flask.Request) -> flask.typing.ResponseReturnValue:
"""HTTP Cloud Function.
Args:
request (flask.Request): The request object. The request object contains
the HTTP request data.
Returns:
The response text, or any set of values that can be turned into a
Response object using `make_response`
"""
return "Hello World!"
Run main.py with:
functions-framework --target hello --debug
* Serving Flask app "hello" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://0.0.0.0:8080/ (Press CTRL+C to quit)
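With the development server listening on port 8080 as shown above, you can invoke it from another terminal. A quick sketch using the requests library (a separate dependency, installed with pip install requests):
import requests

# Call the local functions-framework development server started above.
resp = requests.get("http://localhost:8080/")
print(resp.status_code, resp.text)  # Expected: 200 Hello World!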
Explanation of the functions_framework function
import flask.typing imports type definitions from Flask, which enables Flask type hints and improves code readability.
@functions_framework.http is a decorator, provided by the functions_framework library, that is crucial for defining an HTTP-triggered function. It tells Cloud Run functions that this function (hello) should be triggered by HTTP requests: when a user or service sends an HTTP request to the function's URL, the hello function is executed.
HTTP request handling: the @functions_framework.http decorator simplifies handling HTTP requests. Without it, you'd need to write more code to set up a basic web server. When a request arrives, functions_framework passes a Flask Request object to your function, containing all the request details. Your function processes the request and returns a response, which functions_framework sends back to the client.
hello(request: flask.Request) is the entry point of the function. The request object holds all the incoming HTTP request data, including the headers.
-> flask.typing.ResponseReturnValue is a return type annotation (type hint) indicating that the function should return a value that Flask can use to construct an HTTP response.
💡Note: @functions_framework.http works as a gate that hands the request over to Flask.
💡Note: the type hint does not add or change any behavior; it just makes the function more readable, since readers can see what kind of value the function returns.
The hello function receives the HTTP request data in the request object (a Flask Request object) and returns the string "Hello World!".
Discover more functions_framework functions
The functions_framework library includes multiple functions covering multiple aspects (e.g. cloud events, error handling, etc.); surf the docs for more.
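For example, besides http, the library exposes a cloud_event decorator for event-driven functions. A minimal sketch (the handler name and the printed fields are just illustrative):
import functions_framework

@functions_framework.cloud_event
def my_cloud_event_function(cloud_event):
    # The framework unmarshals the incoming request into a CloudEvent object.
    print("Received event id={} type={}".format(cloud_event["id"], cloud_event["type"]))
    print("Event data:", cloud_event.data)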
functions_framework help
Python's built-in help will assist you in understanding the functions_framework library further; check the FUNCTIONS section below:
>>> import functions_framework
>>> help(functions_framework)
Help on package functions_framework:
NAME
functions_framework
DESCRIPTION
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
PACKAGE CONTENTS
__main__
_cli
_function_registry
_http (package)
_typed_event
background_event
event_conversion
exceptions
execution_id
request_timeout
CLASSES
builtins.object
DummyErrorHandler
LazyWSGIApp
class DummyErrorHandler(builtins.object)
| Methods defined here:
|
| __call__(self, *args, **kwargs)
| Call self as a function.
|
| __init__(self)
| Initialize self. See help(type(self)) for accurate signature.
|
| ----------------------------------------------------------------------
| Data descriptors defined here:
|
| __dict__
| dictionary for instance variables
|
| __weakref__
| list of weak references to the object
class LazyWSGIApp(builtins.object)
| LazyWSGIApp(target=None, source=None, signature_type=None)
|
| Wrap the WSGI app in a lazily initialized wrapper to prevent initialization
| at import-time
|
| Methods defined here:
|
| __call__(self, *args, **kwargs)
| Call self as a function.
|
| __init__(self, target=None, source=None, signature_type=None)
| Initialize self. See help(type(self)) for accurate signature.
|
| ----------------------------------------------------------------------
| Data descriptors defined here:
|
| __dict__
| dictionary for instance variables
|
| __weakref__
| list of weak references to the object
FUNCTIONS
cloud_event(func: Callable[[cloudevents.http.event.CloudEvent], NoneType]) -> Callable[[cloudevents.http.event.CloudEvent], NoneType]
Decorator that registers cloudevent as user function signature type.
crash_handler(e)
Return crash header to allow logging 'crash' message in logs.
create_app(target=None, source=None, signature_type=None)
http(func: Callable[[flask.wrappers.Request], Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], Union[ForwardRef('Headers'), Mapping[str, Union[str, list[str], tuple[str, ...]]], Sequence[tuple[str, Union[str, list[str], tuple[str, ...]]]]]], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], int], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], int, Union[ForwardRef('Headers'), Mapping[str, Union[str, list[str], tuple[str, ...]]], Sequence[tuple[str, Union[str, list[str], tuple[str, ...]]]]]], ForwardRef('WSGIApplication')]]) -> Callable[[flask.wrappers.Request], Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], Union[ForwardRef('Headers'), Mapping[str, Union[str, list[str], tuple[str, ...]]], Sequence[tuple[str, Union[str, list[str], tuple[str, ...]]]]]], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], int], tuple[Union[ForwardRef('Response'), str, bytes, list[Any], Mapping[str, Any], Iterator[str], Iterator[bytes]], int, Union[ForwardRef('Headers'), Mapping[str, Union[str, list[str], tuple[str, ...]]], Sequence[tuple[str, Union[str, list[str], tuple[str, ...]]]]]], ForwardRef('WSGIApplication')]]
Decorator that registers http as user function signature type.
read_request(response)
Force the framework to read the entire request before responding, to avoid
connection errors when returning prematurely. Skipped on streaming responses
as these may continue to operate on the request after they are returned.
setup_logging()
typed(*args)
DATA
Callable = typing.Callable
Deprecated alias to collections.abc.Callable.
Callable[[int], str] signifies a function that takes a single
parameter of type int and returns a str.
The subscription syntax must always be used with exactly two
values: the argument list and the return type.
The argument list must be a list of types, a ParamSpec,
Concatenate or ellipsis. The return type must be a single type.
There is no syntax to indicate optional or keyword arguments;
such function types are rarely used as callback types.
CloudEventFunction = typing.Callable[[cloudevents.http.event.CloudEven...
HTTPFunction = typing.Callable[[flask.wrappers.Request], typing...e[st...
Type = typing.Type
Deprecated alias to builtins.type.
builtins.type or typing.Type can be used to annotate class objects.
For example, suppose we have the following classes::
class User: ... # Abstract base for User classes
class BasicUser(User): ...
class ProUser(User): ...
class TeamUser(User): ...
And a function that takes a class argument that's a subclass of
User and returns an instance of the corresponding class::
def new_user[U](user_class: Type[U]) -> U:
user = user_class()
# (Here we could write the user object to a database)
return user
joe = new_user(BasicUser)
At this point the type checker knows that joe has type BasicUser.
app = <functions_framework.LazyWSGIApp object>
errorhandler = <functions_framework.DummyErrorHandler object>
FILE
/home/user/.local/lib/python3.12/site-packages/functions_framework/__init__.py
Implementation steps
After covering the essential concepts, it’s time to build and get our hands dirty 😉
Let’s start building our resources with the following directory structure:
cloud-function
|__ sources
|   |__ main.py
|   |__ requirements.txt
|   |__ function-source.zip
|__ terraform.tf
|__ buc.tf
|__ func.tf
|__ terraform.tfvars
|__ variables.tf
|__ .terraform.lock.hcl
Build a Python functions_framework HTTP function
Starting with the main.py file:
import functions_framework

@functions_framework.http
def hello_http(request):
    """HTTP Cloud Function.
    Args:
        request (flask.Request): The request object.
        <https://flask.palletsprojects.com/en/1.1.x/api/#incoming-request-data>
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`
        <https://flask.palletsprojects.com/en/1.1.x/api/#flask.make_response>.
    """
    request_json = request.get_json(silent=True)
    request_args = request.args

    if request_json and "name" in request_json:
        name = request_json["name"]
    elif request_args and "name" in request_args:
        name = request_args["name"]
    else:
        name = "World"

    return "Hello {}!".format(name)
Explanation of the hello_http function:
def hello_http(request) defines a function named hello_http. This is the entry point for your Cloud Run function when it's triggered by an HTTP request. request is a parameter that is automatically passed to your function by the runtime. It's an object (specifically a flask.Request object) that contains all the information about the incoming HTTP request, such as headers, query parameters, and the request body.
request_json = request.get_json(silent=True) attempts to get the JSON data from the body of the incoming HTTP request. request.get_json() is a method of the flask.Request object. silent=True is an important argument: if the request body is not valid JSON, get_json(silent=True) returns None instead of raising an exception, which makes the code more robust.
request_args = request.args retrieves the query parameters from the URL of the HTTP request. For example, if the request URL is https://your-cloud-function-url?name=Alice&age=30, then request.args is a dictionary-like object containing {'name': 'Alice', 'age': '30'}.
The if/elif/else block determines the value of the name variable used in the greeting. It follows this logic: if the request body was successfully parsed as JSON (i.e. request_json is not None) and it contains a key called "name", the value of that key is assigned to name. If "name" wasn't found in the JSON body (or there was no valid JSON), the code checks whether the query parameters (request_args) contain a key called "name" and, if so, assigns the corresponding value to name. If "name" is not found in either place, name defaults to "World".
return "Hello {}!".format(name) constructs the HTTP response. It uses the .format() method to insert the value of the name variable into the greeting string "Hello {}!". This string is the body of the HTTP response sent back to the client that made the request.
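To see all three branches in action against the local development server (started with functions-framework --target hello_http), a quick sketch using the requests library:
import requests

base = "http://localhost:8080/"

# "name" taken from the JSON body
print(requests.post(base, json={"name": "Alice"}).text)  # Hello Alice!

# "name" taken from the query parameters
print(requests.get(base, params={"name": "Bob"}).text)   # Hello Bob!

# No "name" anywhere, so the default is used
print(requests.get(base).text)                            # Hello World!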
Handle dependencies with requirements.txt
It’s time to fill in the requirements.txt file as follows:
functions-framework==3.*
Zipping function source code
After finalizing the source code adjustments, let’s package it into a zip file so we can upload it to a Cloud Storage bucket.
cd sources
zip -r function-source.zip .
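If you prefer to avoid the zip CLI, here is a small Python alternative (run from the cloud-function directory; it produces the same sources/function-source.zip with the files at the archive root):
import zipfile

# Package only the function sources; the build expects main.py and
# requirements.txt at the root of the archive.
with zipfile.ZipFile("sources/function-source.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("sources/main.py", arcname="main.py")
    zf.write("sources/requirements.txt", arcname="requirements.txt")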
Build the GCP services with Terraform
Following the directory structure above, let’s craft our resources, starting with terraform.tf.
terraform.tf
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "6.28.0"
    }
  }
}

provider "google" {
  project     = "PLACE_PROJECT_ID_HERE"
  region      = "us-central1" # a region (not a zone such as "us-central1-a")
  credentials = "/home/user/.gcp/secrets/eraki-terraform-srv-key.json" # Create a service account,
  # extract a key file, then place it anywhere locally.
}
variables.tf
variable "region" {}
variable "company_name" {
description = "company name"
type = string
}
variable "region_short" {
description = "the short of region name"
type = string
}
variable "solution_name" {
description = "solution name"
type = string
}
terraform.tfvars
region = "us-central1"
company_name = "eraki"
region_short = "uscen1"
solution_name = "test-1002"
buc.tf
Create a bucket to place the source code zip file into, then reference it from our function:
# Bucket creation
resource "google_storage_bucket" "eraki_uscen1a_bucket_test_1001" {
  name          = "${var.company_name}-${var.region_short}-bucket-${var.solution_name}"
  location      = var.region
  force_destroy = true

  lifecycle_rule {
    condition {
      age = 3
    }
    action {
      type = "Delete"
    }
  }
}

# Zip file upload
resource "google_storage_bucket_object" "eraki_uscen1a_buc_obj_test_1001" {
  name = "${var.company_name}-${var.region_short}-buc_obj-${var.solution_name}"
  # source = "/sources/function-source.zip"
  source = "${path.module}/sources/function-source.zip"
  bucket = google_storage_bucket.eraki_uscen1a_bucket_test_1001.name
}
func.tf
It’s time to create a generation 2 function based on the Python 3.11 runtime, pointing at the source code saved in the bucket we created, and to allow invocation by all users:
# Function creation
resource "google_cloudfunctions2_function" "eraki_uscen1a_function_test_1002" {
  depends_on  = [google_storage_bucket_object.eraki_uscen1a_buc_obj_test_1001]
  name        = "${var.company_name}-${var.region_short}-function-${var.solution_name}"
  location    = var.region
  description = "Hello function"

  build_config {
    runtime     = "python311"
    entry_point = "hello_http"
    source {
      storage_source {
        bucket = google_storage_bucket.eraki_uscen1a_bucket_test_1001.name
        object = google_storage_bucket_object.eraki_uscen1a_buc_obj_test_1001.name
      }
    }
  }

  service_config {
    max_instance_count = 1
    available_memory   = "256M"
    timeout_seconds    = 60
    ingress_settings   = "ALLOW_ALL"
  }
}

resource "google_cloudfunctions2_function_iam_member" "eraki_uscen1a_function_invoker_test_1001" {
  depends_on     = [google_cloudfunctions2_function.eraki_uscen1a_function_test_1002]
  project        = google_cloudfunctions2_function.eraki_uscen1a_function_test_1002.project
  location       = var.region
  cloud_function = google_cloudfunctions2_function.eraki_uscen1a_function_test_1002.name
  role           = "roles/cloudfunctions.invoker"
  member         = "allUsers"
}
Deploy and Test
IT’S TIME to build and test what we have accomplished so far.
Let’s start applying the Terraform code with the following steps:
terraform init
terraform fmt
terraform apply
After a successful apply, navigate to the Cloud Run service, then open the created function and access its URL.
Accessing the URL shows the following:
It used the default value of the name variable, "World". Now let's pass a value for it in the URL as follows:
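If you prefer testing from code instead of the browser, here is a quick sketch with the requests library (the URL below is a placeholder; copy the real one from the Cloud Run console or a Terraform output):
import requests

# Placeholder URL; replace it with your function's actual URL.
url = "https://example-function-url.a.run.app"

print(requests.get(url).text)                            # Hello World!
print(requests.get(url, params={"name": "Eraki"}).text)  # Hello Eraki!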
Resources
That's it: very straightforward, very fast 🚀. I hope this article inspired you, and I would appreciate your feedback. Thank you!
Written by
Mohamed El Eraki
Cloud & DevOps Engineer, Linux & Windows SysAdmin, PowerShell, Bash, Python Scriptwriter, Passionate about DevOps, Autonomous, and Self-Improvement, being DevOps Expert is my Aim.