Do you remember when we used to build and deploy our web applications on a server? We were responsible for managing its resources, and that led to a few issues.

Being charged for a full-time server - even when it wasn’t running or serving any requests. All the time spent maintaining servers, their resources, and dependencies, and applying all the security updates... Not to mention having to adjust resources to deal with scalability and set up the right alerts to handle scaling events.

All those tasks ended up distracting developers from their focus: building their software. Small engineering teams had to stray from their primary mission to absorb those tasks, while bigger teams had to divide them up between different roles, slowing down the development and shipping of their software.

As developers looked for a solution to these problems, the Serverless ecosystem, and Serverless Functions in particular, emerged.

A step closer to microservice infrastructures with Serverless

Microservices have become necessary to build agile applications and create a working environment that allows engineers to form autonomous teams.

Microservices are a way to design your infrastructure with independent services that have a dedicated purpose. The idea is to focus on building individual services that do one thing and do it well.

In response to fast-scaling startups moving away from monolithic architectures, we spent the last two years building our new Serverless ecosystem to provide an autonomous way of running an application: Scaleway Serverless Containers and Scaleway Serverless Functions.

What are Serverless Containers and Serverless Functions?


Serverless is a model in which code is executed on demand, invoked when triggered by a request. Developers can configure all of this from their Scaleway Console. That code can represent an entire application, but the most common use of Serverless Functions is to implement individual application features as units. You can add multiple functions to your application or software and reuse the same function across various applications. The code runs inside stateless containers triggered by events (CRON, HTTP request, etc.).
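In Python, one of the supported runtimes, such a function is just a handler that receives an event and returns a response. The sketch below is illustrative: the `handle(event, context)` signature and the response shape follow common FaaS conventions and should be checked against the documentation.

```python
import json

def handle(event, context):
    """Entry point invoked on each trigger (HTTP request, CRON, etc.).

    Note: the handle(event, context) signature and the response dict
    shape are assumptions based on common FaaS conventions.
    """
    # Read an optional "name" query parameter from the triggering request.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The platform handles everything around this handler: routing the event in, scaling containers up when traffic arrives, and scaling back to zero when it stops.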

Our goal is to give developers an abstraction layer to run their software without having to manage servers. It is associated with the "on-demand" functionality to enable infrastructures to be more flexible and pay only for used resources. You send us your code in the form of a function, and we will set it up for you and scale it for you when needed.

For example, let’s say you have an application where users need to upload images. You could set up a trigger to execute a function whenever a user uploads an image; with Serverless Functions, it stops running as soon as the task is complete, then gets triggered again when a new image is uploaded.
Without Serverless Functions, you would need all the parts of your application to run constantly, instead of configuring specific components to run only when needed - thus saving resources.
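The upload scenario above can be sketched in Python as follows. The event field names (`bucket`, `key`) are hypothetical, chosen only to illustrate the pattern of a function that runs once per upload and then exits.

```python
def handle(event, context):
    # Hypothetical event fields: a real trigger payload may name them differently.
    bucket = event.get("bucket")
    key = event.get("key")
    if not bucket or not key:
        return {"statusCode": 400, "body": "missing bucket or key"}
    # Do the one-off work here (e.g. generate a thumbnail), then return.
    # The container stops afterwards and costs nothing until the next upload.
    return {"statusCode": 200, "body": f"processed {bucket}/{key}"}
```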

How to take the best advantage of Serverless Functions

A front-end application is not a good fit for Serverless Functions: it needs to run all the time. But an authentication service, which is only requested occasionally, could be a good candidate for serverless.

Serverless Functions make it easy to integrate third-party services (GitHub, Slack, etc.) through webhooks. You can also process files stored on Object Storage, use Serverless Functions as a route destination to execute an action based on IoT events, or even process data to transform datasets.
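For instance, a function receiving GitHub webhooks typically verifies the `X-Hub-Signature-256` header before acting on the payload. This sketch uses only the Python standard library; the secret and header handling are illustrative.

```python
import hashlib
import hmac

def verify_github_signature(secret: bytes, body: bytes, signature_header: str) -> bool:
    """Check a GitHub webhook's X-Hub-Signature-256 header.

    GitHub signs the raw request body with HMAC-SHA256 and sends
    "sha256=<hexdigest>"; we recompute it and compare in constant time.
    """
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header)
```

Inside a function handler, you would call this with the raw request body and the signature header, and return a 401 response when it fails.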

Serverless Functions are also great for automating operations (managing your Instances, your registry, etc.) via Scaleway’s API.
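As a sketch of that kind of automation, the snippet below builds (but does not send) an authenticated request listing Instances. The `/instance/v1/zones/{zone}/servers` path and `X-Auth-Token` header follow Scaleway’s public API, but double-check them against the current API reference before relying on them.

```python
import urllib.request

SCW_API = "https://api.scaleway.com"  # Scaleway API base URL

def build_list_servers_request(secret_key: str, zone: str = "fr-par-1") -> urllib.request.Request:
    """Build an authenticated request listing Instances in a zone.

    Path and auth header are based on Scaleway's public API; verify
    against the current documentation.
    """
    return urllib.request.Request(
        f"{SCW_API}/instance/v1/zones/{zone}/servers",
        headers={"X-Auth-Token": secret_key},
    )

# Sending it from inside a function (e.g. a CRON-triggered cleanup job):
# with urllib.request.urlopen(build_list_servers_request(token)) as resp:
#     servers = json.load(resp)["servers"]
```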


Since your function runs inside a container that is started on demand in response to events, the first invocation after a period of inactivity incurs some latency. This is referred to as a Cold Start. Your container might be kept around for a little while after your function has completed execution; if another event is triggered during this time, it will respond far more quickly. This is known as a Warm Start.

How do Scaleway’s Serverless Functions work?

Scaleway’s Serverless ecosystem is built with Knative components running on top of Kubernetes. Knative is an open-source, Kubernetes-based platform that can be used to build, deploy, and manage modern Serverless workloads. The best practices from demanding use cases have been codified into it.

Serverless Functions are compatible with the Serverless Framework, and support Python, Go, and Node.js.
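With the Serverless Framework, deploying comes down to describing the function in a `serverless.yml` file. The fragment below is illustrative only: the plugin name, runtime identifier, and provider fields are assumptions to verify against the documentation.

```yaml
# Illustrative serverless.yml sketch; check field names against the docs.
service: hello-scaleway
provider:
  name: scaleway
  runtime: python310
plugins:
  - serverless-scaleway-functions
functions:
  hello:
    handler: handler.handle
```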


Learn more

Check out our documentation page if you want to explore how to use Serverless Functions and get started.

If you have questions about Serverless, you are welcome to join our strong community on Slack, and our Excellence team is ready to answer your tickets and questions on our live chat.