Starting your server
Before we start with Fastify, we need to set up a development environment. To create an empty project with npm, open your system’s shell and run the following commands:
```sh
mkdir fastify-playground
cd fastify-playground/
npm init --yes
npm install fastify
```
These commands create an empty folder and initialize a Node.js project in the new directory; you should see a successful message on each npm command execution.
Now, we are ready to start an HTTP server with Fastify, so create a new starts.cjs file and check out these few lines:
```js
const fastify = require('fastify')              // [1]

const serverOptions = {                         // [2]
  logger: true
}

const app = fastify(serverOptions)              // [3]

app.listen({ port: 8080, host: '0.0.0.0' })
  .then((address) => {                          // [4]
    // Server is now listening on ${address}
  })
```
Let’s break down each element of this code. The imported framework is a factory function [1] that builds the Fastify server root application instance.
Book code style
All the book’s code snippets are written in CommonJS (CJS). The CJS syntax has been preferred over ECMAScript Modules (ESM) because ESM is not yet fully supported by tools such as application performance monitoring (APM) agents or test frameworks. Using the require function to import modules lets us focus on the code, avoiding issues that can’t be covered in this book.
The factory accepts an optional JavaScript object as input [2] to customize the server and its behavior, for instance, by supporting HTTPS and the HTTP2 protocol. You will get a complete overview of these options later on in this chapter. The application instance returned by the factory lets us build the application by adding routes to it and by configuring and managing the HTTP server’s start and stop phases.
Once the factory has built our server instance [3], we can execute the listen method, which returns a Promise. Awaiting it will start the server [4]. This method exposes a broad set of options to configure where to listen for incoming requests; the most common is to configure the port and the host.
Calling listen with the 0.0.0.0 host will make your server listen on all available IPv4 interfaces. This configuration is necessary for an application running in a Docker container, or for any application that is directly exposed to the internet; otherwise, external clients won’t be able to reach your HTTP server.
To execute the previous code, you need to run this command:

```sh
node starts.cjs
```
This will start the Fastify server. Calling the http://localhost:8080/ URL with an HTTP client, or just a browser, will show a 404 response because we haven’t added any routes yet.
Congratulations, you have started your first Fastify server! You can kill it by pressing Ctrl + C or Cmd + C.
We have seen the root instance component in action. In a few lines of code, we were able to start an HTTP server with minimal effort! Before digging further into the code, in the next section, we will start to understand what Fastify does under the hood when we start it.
Lifecycle and hooks overview
Fastify implements two systems that regulate its internal workflow: the application lifecycle and the request lifecycle. These two lifecycles trigger a large set of events during the application’s lifetime. Listening to those events will let us customize the data flow around the endpoints or simply add monitoring tools.
The application lifecycle tracks the status of the application instance and triggers this set of events:
- The onRoute event fires when you add an endpoint to the server instance
- The onRegister event is unique, as it fires when a new encapsulated context is created
- The onReady event runs when the application is ready to start listening for incoming HTTP requests
- The onClose event executes when the server is stopping
All these events are Fastify’s hooks. More specifically, a function that runs whenever a specific event happens in the system is a hook. The hooks that listen for application lifecycle events are called application hooks. They can intercept and control the application server boot phases, which involve:
- The routes’ and plugins’ initialization
- The application’s start and close
Here is a quick usage example of what happens when we add this code before the listen call in the previous code block:
```js
app.addHook('onRoute', function inspector (routeOptions) {
  console.log(routeOptions)
})

app.addHook('onRegister', function inspector (plugin, pluginOptions) {
  console.log('Chapter 2, Plugin System and Boot Process')
})

app.addHook('onReady', function preLoading (done) {
  console.log('onReady')
  done()
})

app.addHook('onClose', function manageClose (done) {
  console.log('onClose')
  done()
})
```
We see that there are two primary API interfaces for these hooks:
- The onRoute and onRegister hooks receive some object arguments. These hooks can only manipulate the input object by adding side effects. A side effect changes the object’s property values, causing new behavior of the object itself.
- The onReady and onClose hooks receive a callback-style function input instead. The done input function can impact the application’s startup because the server will wait for its completion, which happens only when you call it. In this timeframe, it is possible to load some external data and store it in a local cache (see the sketch after this list). If you call the callback with an error object as the first parameter, done(new Error()), the application will fail to start, and the error will bubble up, crashing the server startup. So, it’s crucial to load relevant data and manage errors to prevent them from blocking the server.
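As a minimal sketch of this pattern (this is not part of our starts.cjs example: the banned-users.json file and the localCache object are made-up placeholders), an onReady hook could preload some data from the filesystem and fail the boot when something goes wrong:

```js
const fs = require('fs')

const localCache = {} // a plain in-memory cache shared by the application

app.addHook('onReady', function preLoading (done) {
  // fs.readFile stands in for any slow I/O source (a database, a remote API, ...)
  fs.readFile('banned-users.json', 'utf8', (err, data) => {
    if (err) {
      // Passing an error to done() blocks the startup and bubbles the error up
      return done(err)
    }
    localCache.bannedUsers = JSON.parse(data)
    done() // the boot sequence can now continue
  })
})
```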
As presented in the earlier addHook example, running our source code will print out only the onReady string in the console. Why are the other hooks not running? This happens because the events we are listening to have not been triggered yet. They will start working by the end of this chapter!
Note that whenever a Fastify interface exposes a done or next argument, you can omit it and provide an async function instead. So, you can write:
```js
app.addHook('onReady', async function preLoading () {
  console.log('async onReady') // the done argument is gone!
})
```
If you don’t need to run asynchronous code, such as I/O to the filesystem or to an external resource such as a database, you may prefer the callback style. It provides a simple done function within the arguments and is slightly more performant than an async function!
You can call the addHook() method multiple times to queue the hooks’ functions. Fastify guarantees to execute them all in the order of addition.
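For instance, here is a short sketch (separate from our playground file) with two onReady hooks queued one after the other; the second one runs only after the first one has called done:

```js
app.addHook('onReady', function first (done) {
  console.log('first onReady hook')
  setTimeout(done, 100) // even with a delay, the next hook waits for this one
})

app.addHook('onReady', async function second () {
  console.log('second onReady hook') // always printed after the first one
})
```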
All these phases can be schematized into this execution flow:
Figure 1.1 – Application lifecycle
At the start of the application, the onRoute and onRegister hooks are executed whenever a new route or a new encapsulated context is created (we will discuss the encapsulated context by the end of this chapter, in the Adding a basic plugin instance section). The dashed lines in Figure 1.1 mean that these hooks’ functions are run synchronously and are not awaited before the server starts up. When the application is loaded, the onReady hooks queue is executed, and the server will start listening if no errors happen during this startup phase. Only after the application is up and running will it be able to receive stop events. These events will start the closing stage, during which the onClose hooks’ queue will be executed before the server stops. The closing phase will be discussed in the Shutting down the application section.
The request lifecycle, instead, has a lot more events. But keep calm—Chapter 4 talks about them extensively, and you will learn how to use them, why they exist, and when you should use them. The hooks listening to the request’s lifecycle events are request and reply hooks. This lifecycle defines the flow of every HTTP request that your server will receive. The server will process the request in two phases:
- The routing: this step finds the function that will evaluate the request
- The handling of the request: this phase runs the set of events that compose the request lifecycle
The request triggers these events, in order, during its handling (a short sketch of two of these hooks follows the list):
- onRequest: the server receives an HTTP request and routes it to a valid endpoint. Now, the request is ready to be processed.
- preParsing happens before the evaluation of the request’s body payload.
- The preValidation hook runs before applying JSON Schema validation to the request’s parts. Schema validation is an essential step of every route because it protects you from malicious request payloads that aim to leak your system’s data or attack your server. Chapter 5 discusses this core aspect further and will show some harmful attacks.
- preHandler executes before the endpoint handler.
- preSerialization takes action before the response payload is transformed into a String, a Buffer, or a Stream, in order to be sent to the client.
- onError is executed only if an error happens during the request lifecycle.
- onSend is the last chance to manipulate the response payload before sending it to the client.
- onResponse runs after the HTTP request has been served.
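As a quick hedged preview of the request hooks (Chapter 4 covers them properly), this is how two of these events could be observed on our application instance:

```js
app.addHook('onRequest', async function logIncoming (request) {
  // Runs as soon as the request has been routed to an endpoint
  request.log.info({ url: request.url }, 'incoming request')
})

app.addHook('onResponse', async function logServed (request, reply) {
  // Runs after the response has been sent to the client
  request.log.info({ statusCode: reply.statusCode }, 'request served')
})
```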
We will see more examples later on. I hope you have enjoyed the spoilers! But first, we must dive deeper into the Fastify server to understand how to use it and how it interacts with the lifecycle.
The root application instance
The root application instance is the main interface you need to create your API. All the functions controlling the incoming client’s requests must be registered to it, and it provides a set of helpers that let you best organize the application. We have already seen how to build it using the const app = fastify(serverOptions) statement. Now, we will present a general overview of the possible options to configure and use this object.
Server options
When you create a Fastify server, you have to choose some key aspects before starting the HTTP server. You can configure them by providing the option input object, which has many parameters, all listed in the Fastify documentation (https://www.fastify.io/docs/latest/Reference/Server/).
Now, we will explore all the aspects you can set with this configuration:
- The logger parameter gives you control to adapt the default logger to your convenience and system infrastructure, in order to achieve distributed logging and meaningful logs; Chapter 11 will discuss broadly how to best set up these parameters.
- The https object sets up the server to listen for Transport Layer Security (TLS) sockets. We will see some examples later on, in Chapter 7.
- keepAliveTimeout, connectionTimeout, and http2SessionTimeout are several timeout parameters after which the HTTP request socket will be destroyed, releasing the server’s resources. These parameters are forwarded to the standard Node.js http.Server.
- Routing customization provides stricter or laxer constraints, for instance, a case-insensitive URL, and more granular control to route a request to a handler based on additional information, such as a request header instead of just the HTTP method and URL. We will cover this in Chapter 3.
- maxParamLength: number<length> limits the path parameter string length.
- bodyLimit: number<byte> caps the request body payload size.
- http2: boolean starts an HTTP2 server, which is useful to create a long-lived connection that optimizes the exchange of data between client and server.
- The ajv parameter tweaks the validation defaults to improve the fit of your setup. Chapter 5 will show you how to use it.
- The serverFactory: function manages the low-level HTTP server that is created. This feature is a blessing when you work in a serverless environment.
- The onProtoPoisoning and onConstructorPoisoning default security settings are the most conservative and provide you with an application that is secure by default. Changing them is risky, and you should consider all the security implications because they impact the default request body parser and can lead to code injection. Chapter 4 will show you an example of these parameters in action.
Are you overwhelmed by all these options? Don’t worry. We are going to explore some of them in the following examples, starting with the combined sketch below. The options provided not only allow you to adapt Fastify to a wide range of general use cases but also extend this possibility to edge cases; usually, you may not need to configure all these parameters at all. Just remember that the default settings are ready for production and provide the most secure defaults and the most useful utilities, such as the 404 Not Found and 500 Error handlers.
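To make these names more concrete, here is a hedged sketch of a serverOptions object that combines a few of the parameters above; the values are arbitrary placeholders rather than recommendations:

```js
const fastify = require('fastify')

const serverOptions = {
  logger: true,             // enable the default logger
  keepAliveTimeout: 72000,  // milliseconds an idle keep-alive socket stays open after a response
  connectionTimeout: 10000, // milliseconds of socket inactivity before the connection is closed
  bodyLimit: 1048576,       // maximum request body size, in bytes (1 MiB)
  maxParamLength: 100,      // maximum path parameter length, in characters
  caseSensitive: true       // routing customization: /Hello and /hello are different URLs
}

const app = fastify(serverOptions)
```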
Application instance properties
The Fastify server exposes a set of valuable properties to access:
- An app.server getter that returns the standard Node.js http.Server or https.Server.
- app.log returns the application logger that you can use to print out meaningful information.
- app.initialConfig to access the input configuration in read-only mode. It will be convenient for plugins that need to read the server configuration.
We can see them all in action at the server startup:
```js
await app.listen({ port: 0, host: '0.0.0.0' })

app.log.debug(app.initialConfig, 'Fastify listening with the config')
const { port } = app.server.address()
app.log.info('HTTP Server port is %i', port)
```
Setting the port parameter to 0 will ask the operating system to assign an unused port on the host to your HTTP server, which you can then read through the standard Node.js address() method. Running the code will show you the output log in the console, which displays the server’s properties.
Unfortunately, we won’t be able to see the output of the debug log. The log doesn’t appear because Fastify is protecting us from misconfiguration, so, by default, the log level is set to info. The log-level values accepted by default are fatal, error, warn, info, debug, trace, and silent. We will see a complete log setup in Chapter 11.
So, to fix this issue, we just need to update our serverOptions parameter to the following:
```js
const serverOptions = {
  logger: {
    level: 'debug'
  }
}
```
By doing so, we will see our log printed out on the next server restart! We have seen the instance properties so far; in the next section, we will introduce the server instance methods.
Application instance methods
The application instance lets us build the application, adding routes and empowering Fastify’s default components. We have already seen the app.addHook(eventName, hookHandler) method, which appends a new function that runs whenever the request lifecycle or the application lifecycle triggers the registered event.
The methods at your disposal to create your application are:
- app.route(options[, handler]) adds a new endpoint to the server.
- app.register(plugin) adds plugins to the server instance, creating a new server context if needed. This method provides Fastify with encapsulation, which will be covered in Chapter 2.
- app.ready([callback]) loads the whole application without starting to listen for HTTP requests.
- app.listen(port|options [,host, callback]) starts the server and loads the application.
- app.close([callback]) turns off the server and starts the closing flow, giving you the chance to close all the pending connections to a database or to complete any running tasks.
- app.inject(options[, callback]) loads the server until it reaches the ready status and submits a mock HTTP request. You will learn about this method in Chapter 9.
This API family will return a native Promise if you don’t provide a callback parameter. This code pattern works for every feature that Fastify provides: whenever there is a callback argument, you can omit it and get back a promise instead!
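As a brief sketch of this dual interface, using the app instance from our starts.cjs file, the app.ready() method can be driven either way:

```js
// Callback style: Fastify calls your function once the application has loaded
app.ready(function (err) {
  if (err) {
    throw err
  }
  console.log('application loaded (callback style)')
})

// Promise style: omit the callback and you get a promise back instead
app.ready()
  .then(() => console.log('application loaded (promise style)'))
  .catch((err) => console.error(err))
```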
Now, you have a complete overview of the Fastify server instance component and the lifecycle logic that it implements. We are ready to use what we have read till now and add our first endpoints to the application.