This document describes best practices for designing, implementing, testing,
and deploying Cloud Functions.
Correctness
This section describes general best practices for designing and implementing
Cloud Functions.
Write idempotent functions
Your functions should produce the same result even if they are called multiple
times. This lets you retry an invocation if the previous invocation fails
partway through your code. For more information, see retrying event-driven functions.
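For example, an event-driven function can record the IDs of events it has already handled and skip duplicates on retry. The sketch below is illustrative: `handle_event` is a hypothetical handler, and the in-memory set stands in for a durable store such as Firestore (instances are stateless, so real code must persist this state).

```python
# Minimal idempotency sketch. In production, replace this in-memory set
# with a durable store (e.g., Firestore), since instances are stateless.
processed_ids = set()

def handle_event(event_id, payload):
    """Hypothetical event handler; safe to retry with the same event_id."""
    if event_id in processed_ids:
        return "already processed"  # a retry changes nothing
    # ... do the actual work with payload here ...
    processed_ids.add(event_id)
    return "processed"
```

Calling the handler a second time with the same event ID is a no-op, so a retry after a partial failure cannot duplicate the work.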
Do not start background activities
Background activity is anything that happens after your function has terminated.
A function invocation finishes once the function returns or otherwise signals
completion, such as by calling the callback argument in Node.js event-driven
functions. Any code run after graceful termination cannot access the CPU and
will not make any progress.
In addition, when a subsequent invocation is executed in the same environment,
your background activity resumes, interfering with the new invocation. This may
lead to unexpected behavior and errors that are hard to diagnose. Accessing
the network after a function terminates usually leads to connections being reset
(ECONNRESET error code).
Background activity can often be detected in logs from individual invocations,
by finding anything that is logged after the line saying that the invocation
finished. Background activity can sometimes be buried deeper in the code,
especially when asynchronous operations such as callbacks or timers are present.
Review your code to make sure all asynchronous operations finish before you
terminate the function.
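In Python, for example, this means joining any worker threads before the handler returns. A minimal sketch (the `handler` and `send_metrics` names are illustrative, not part of any API):

```python
import threading

results = []

def send_metrics():
    # Simulates background work such as flushing metrics over the network.
    results.append("metrics sent")

def handler(request):
    worker = threading.Thread(target=send_metrics)
    worker.start()
    # Join the thread before returning. Work still running after the
    # function signals completion gets no CPU and may never finish.
    worker.join()
    return "OK"
```

Without the `join()`, the thread could still be running when the function returns, becoming background activity that stalls and later interferes with the next invocation.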
Always delete temporary files
Local disk storage in the temporary directory is an in-memory filesystem. Files
that you write consume memory available to your function, and sometimes persist
between invocations. Failing to explicitly delete these files may eventually
lead to an out-of-memory error and a subsequent cold start.
You can see the memory used by an individual function by selecting it in the
list of functions in the GCP Console and choosing the Memory usage plot.
Do not attempt to write outside of the temporary directory, and be sure to use
platform/OS-independent methods to construct file paths.
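A sketch of both points together: `tempfile.gettempdir()` and `os.path.join` build the path portably, and a `finally` block guarantees the file is deleted even if processing fails (`process_payload` is a hypothetical name):

```python
import os
import tempfile

def process_payload(data: bytes) -> int:
    # Build the path portably instead of hard-coding "/tmp/...".
    tmp_path = os.path.join(tempfile.gettempdir(), "payload.bin")
    try:
        with open(tmp_path, "wb") as f:
            f.write(data)
        # ... work with the file here ...
        return os.path.getsize(tmp_path)
    finally:
        # Always delete: the file otherwise consumes the function's memory
        # and may persist into later invocations on the same instance.
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
```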
You can reduce memory requirements when processing larger files by using pipelining.
For example, you can process a file on Cloud Storage by creating a read stream,
passing it through a stream-based process, and writing the output stream
directly to Cloud Storage.
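The sketch below shows the shape of such a pipeline using generic binary streams; with the Cloud Storage client library, the same pattern would read from and write to blob streams instead (an assumption about your setup, not shown here). Only one chunk is held in memory at a time:

```python
import io

def stream_transform(src, dst, chunk_size=64 * 1024):
    """Copy src to dst in fixed-size chunks, applying a per-chunk transform."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk.upper())  # illustrative transform

# Demonstration with in-memory streams standing in for Cloud Storage streams.
src = io.BytesIO(b"hello")
dst = io.BytesIO()
stream_transform(src, dst)
```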
Functions Framework
When you deploy a function, the Functions Framework is automatically added as a
dependency, using its current version. To ensure that the same dependencies
are installed consistently across different environments, we recommend that
you pin your function to a specific version of the Functions Framework.
To do this, include your preferred version in the relevant lock file
(for example, package-lock.json for Node.js, or requirements.txt for Python).
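For example, a Python requirements.txt might pin the framework explicitly (the version number shown is illustrative; pin whichever version you have tested):

```
functions-framework==3.5.0
```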
Tools
This section provides guidelines on how to use tools to implement, test, and
interact with Cloud Functions.
Local development
Function deployment takes a bit of time, so it is often faster to test the code
of your function locally.
Firebase developers can use the Firebase CLI Cloud Functions Emulator.
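For example, after installing the Firebase CLI and initializing your project, you can start only the functions emulator:

```shell
# Run the Cloud Functions emulator locally, without the other emulators.
firebase emulators:start --only functions
```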
Use SendGrid to send emails
Cloud Functions does not allow outbound connections on port 25, so you cannot
make non-secure connections to an SMTP server. The recommended way to send
emails is to use SendGrid. You can find other options for sending email in the
Sending Email from an Instance tutorial for Google Compute Engine.
Performance
This section describes best practices for optimizing performance.
Use dependencies wisely
Because functions are stateless, the execution environment is often initialized
from scratch (during what is known as a cold start). When a cold start occurs,
the global context of the function is evaluated.
If your functions import modules, the load time for those modules can add to the
invocation latency during a cold start. You can reduce this latency, as well as
the time needed to deploy your function, by loading dependencies correctly and
not loading dependencies your function doesn't use.
Use global variables to reuse objects in future invocations
There is no guarantee that the state of a Cloud Function will be preserved for
future invocations. However, Cloud Functions often recycles the execution
environment of a previous invocation. If you declare a variable in global scope,
its value can be reused in subsequent invocations without having to be
recomputed.
This way you can cache objects that may be expensive to recreate on each
function invocation. Moving such objects from the function body to global scope
may result in significant performance improvements. The following example
creates a heavy object only once per function instance, and shares it across all
function invocations reaching the given instance:
Node.js
console.log('Global scope');
const perInstance = heavyComputation();
const functions = require('firebase-functions');
exports.function = functions.https.onRequest((req, res) => {
  console.log('Function invocation');
  const perFunction = lightweightComputation();
  res.send(`Per instance: ${perInstance}, per function: ${perFunction}`);
});
Python
import time

from firebase_functions import https_fn

# Placeholder
def heavy_computation():
    return time.time()

# Placeholder
def light_computation():
    return time.time()

# Global (instance-wide) scope
# This computation runs at instance cold-start
instance_var = heavy_computation()

@https_fn.on_request()
def scope_demo(request):
    # Per-function scope
    # This computation runs every time this function is called
    function_var = light_computation()
    return https_fn.Response(f"Instance: {instance_var}; function: {function_var}")
This HTTP function takes a request object (flask.Request), and returns the
response text, or any set of values that can be turned into a Response object
using make_response.
It is particularly important to cache network connections, library references,
and API client objects in global scope.
See Optimizing Networking for examples.
Do lazy initialization of global variables
If you initialize variables in global scope, the initialization code will always
be executed via a cold start invocation, increasing your function's latency.
In certain cases, this causes intermittent timeouts to the services being called
if they are not handled appropriately in a try/catch block. If
some objects are not used in all code paths, consider initializing them lazily
on demand:
Node.js
const functions = require('firebase-functions');
let myCostlyVariable;

exports.function = functions.https.onRequest((req, res) => {
  doUsualWork();
  if (unlikelyCondition()) {
    myCostlyVariable = myCostlyVariable || buildCostlyVariable();
  }
  res.status(200).send('OK');
});
Python
from firebase_functions import https_fn

# Always initialized (at cold-start)
non_lazy_global = file_wide_computation()

# Declared at cold-start, but only initialized if/when the function executes
lazy_global = None

@https_fn.on_request()
def lazy_globals(request):
    global lazy_global, non_lazy_global
    # This value is initialized only if (and when) the function is called
    if not lazy_global:
        lazy_global = function_specific_computation()
    return https_fn.Response(f"Lazy: {lazy_global}, non-lazy: {non_lazy_global}.")
This HTTP function uses lazily-initialized globals. It takes a request object
(flask.Request), and returns the response text, or any set of values that can
be turned into a Response object using make_response.
This is particularly important if you define several functions in a single file,
and different functions use different variables. Unless you use lazy
initialization, you may waste resources on variables that are initialized but
never used.
Reduce cold starts by setting a minimum number of instances
By default, Cloud Functions scales the number of instances based on the
number of incoming requests. You can change this default behavior by setting a
minimum number of instances that Cloud Functions must keep ready to
serve requests. Setting a minimum number of instances reduces cold starts of
your application. We recommend setting a minimum number of instances if your
application is latency-sensitive.
See Control scaling behavior for more information on these runtime options.
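For example, with the gcloud CLI you can set a minimum instance count at deploy time (the function name and runtime below are placeholders):

```shell
# Keep at least 2 instances warm to reduce cold starts.
gcloud functions deploy my-function \
    --runtime=python312 \
    --trigger-http \
    --min-instances=2
```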
Additional resources
Find out more about optimizing performance in the "Google Cloud Performance
Atlas" video Cloud Functions Cold Boot Time.