Using functions to initialize weights and biases
Weights and biases are integral to the optimization of any deep neural network, so here we define a couple of functions to automate their initialization. It is good practice to initialize weights with a small amount of noise to break symmetry and prevent zero gradients. Additionally, a small positive initial bias helps avoid dead (inactivated) neurons, which makes it well suited to ReLU activations.
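To see why the small noise matters, consider a minimal base-R sketch of a hypothetical layer with 3 inputs and 2 hidden units (the shapes and values here are illustrative assumptions, not part of the recipe). With identical initial weights, both units compute the same output, so they receive identical gradient updates and never differentiate; a little random noise breaks that symmetry:

```r
x <- c(1, 2, 3)                              # one hypothetical input example

W_zero <- matrix(0, nrow = 3, ncol = 2)      # symmetric (all-zero) init
h_zero <- as.vector(x %*% W_zero)            # both units produce the same value

set.seed(42)
W_noise <- matrix(rnorm(6, sd = 0.1), 3, 2)  # small random noise
h_noise <- as.vector(x %*% W_noise)          # units now differ, symmetry broken
```

Here `h_zero[1]` equals `h_zero[2]`, while `h_noise[1]` and `h_noise[2]` differ, which is exactly the property the weight-initialization function below is designed to provide.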
Getting ready
Weights and biases are model coefficients that need to be initialized before model compilation. This step requires the shape
parameter to be determined based on the input dataset.
How to do it...
- The following function is used to return randomly initialized weights:
# Weight initialization: truncated normal with a small standard deviation
weight_variable <- function(shape) {
  initial <- tf$truncated_normal(shape, stddev = 0.1)
  tf$Variable(initial)
}
- The following function is used to return constant biases:
# Bias initialization: small positive constant
bias_variable <- function(shape) {
  initial <- tf$constant(0.1, shape = shape)
  tf$Variable(initial)
}
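As a rough illustration of what these two helpers produce, the same initialization can be sketched in plain base R without TensorFlow (the 784 x 10 shape is a hypothetical example, such as flattened 28 x 28 images feeding 10 output classes; `truncated_normal` below is our own stand-in that mimics `tf$truncated_normal` by resampling any draw more than two standard deviations from the mean):

```r
# Stand-in for tf$truncated_normal: redraw values outside 2 * sd of the mean
truncated_normal <- function(n, mean = 0, sd = 0.1) {
  x <- rnorm(n, mean, sd)
  resample <- abs(x - mean) > 2 * sd
  while (any(resample)) {
    x[resample] <- rnorm(sum(resample), mean, sd)
    resample <- abs(x - mean) > 2 * sd
  }
  x
}

set.seed(123)
W <- matrix(truncated_normal(784 * 10), nrow = 784, ncol = 10)  # small noise
b <- rep(0.1, 10)                                               # constant bias
```

Every entry of `W` lies within 0.2 of zero (two standard deviations), and every entry of `b` is exactly 0.1, matching the small-noise weights and small positive biases that the TensorFlow helpers return.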