Dropout can be applied as a function through tf.nn.dropout(x, rate, ...) (refer to the documentation at https://www.tensorflow.org/api_docs/python/tf/nn/dropout) to directly obtain a tensor with values randomly dropped, or as a layer through tf.keras.layers.Dropout() (refer to the documentation at https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dropout), which can be added to neural network models. By default, tf.keras.layers.Dropout() is only active during training (that is, when the layer/model is called with the training=True parameter) and is deactivated otherwise (forwarding the values without any alteration).
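The two interfaces can be compared with a minimal sketch like the following (the rate of 0.25 is an arbitrary example value):

import tensorflow as tf

x = tf.ones((2, 4))

# Functional interface: values are randomly dropped at every call, and the
# surviving values are scaled by 1 / (1 - rate) to preserve the expected sum.
y = tf.nn.dropout(x, rate=0.25)

# Layer interface: dropout is only applied when called with training=True.
dropout = tf.keras.layers.Dropout(0.25)
y_train = dropout(x, training=True)   # some values zeroed out, the rest scaled
y_infer = dropout(x, training=False)  # x is forwarded without alteration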
Dropout layers should be added directly after the layers we want to prevent from overfitting (as dropout layers will randomly drop values returned by the preceding layers, forcing those layers to adapt). For instance, you can apply dropout (for example, with a drop rate of 0.2) to a fully connected layer in Keras, as shown in the following code block:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential([
    Dense(128, activation='relu', input_shape=(64,)),  # example layer sizes
    Dropout(0.2),  # drops ~20% of the Dense layer's output values during training
    Dense(10, activation='softmax')
])
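Note that when a model is trained through model.fit(), Keras automatically calls its layers with training=True, whereas model.predict() and model.evaluate() call them with training=False, so the dropout layer is disabled at inference time without any further intervention.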