Data scientists spend many hours and experiments arriving at the set of hyperparameters that yields the best model performance. The process is largely trial and error.
Grid search is the technique data scientists have traditionally used, but it suffers from the curse of dimensionality. For example, with two hyperparameters, each taking five possible values, the objective function must be evaluated 25 times (5 x 5). As the number of hyperparameters grows, the number of objective-function evaluations grows exponentially.
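The 5 x 5 example above can be sketched in a few lines of plain Python. The objective function here is a hypothetical stand-in (a simple quadratic, lowest at lr=0.1, depth=5); in practice it would train and validate a model, which is what makes each of the 25 evaluations expensive:

```python
import itertools

# Hypothetical objective: pretend lower is better (e.g. validation loss).
# In practice this call would train and evaluate a model.
def objective(lr, depth):
    return (lr - 0.1) ** 2 + (depth - 5) ** 2

# Two hyperparameters, five candidate values each -> 5 x 5 = 25 evaluations.
learning_rates = [0.001, 0.01, 0.1, 0.5, 1.0]
depths = [2, 4, 5, 8, 10]

grid = list(itertools.product(learning_rates, depths))
assert len(grid) == 25  # the combinatorial blow-up described above

best = min(grid, key=lambda params: objective(*params))
print(best)  # -> (0.1, 5)
```

A third hyperparameter with five values would push this to 125 evaluations, which is the exponential growth at work.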
Random search addresses this issue by sampling hyperparameter values at random rather than exhaustively evaluating every combination. This paper by Bergstra et al. claims that a random search of the...
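A minimal sketch of the random-search idea, reusing the same hypothetical objective as before: instead of all 25 grid points, we draw a fixed budget of random configurations, sampling the learning rate log-uniformly (a common choice for scale-like hyperparameters):

```python
import random

# Same hypothetical objective as in the grid-search sketch.
def objective(lr, depth):
    return (lr - 0.1) ** 2 + (depth - 5) ** 2

random.seed(0)
n_trials = 10  # a fixed budget, far fewer than the 25 grid evaluations

best_params, best_score = None, float("inf")
for _ in range(n_trials):
    lr = 10 ** random.uniform(-3, 0)   # sample lr log-uniformly in [0.001, 1]
    depth = random.randint(2, 10)      # sample depth uniformly in {2, ..., 10}
    score = objective(lr, depth)
    if score < best_score:
        best_params, best_score = (lr, depth), score

print(best_params, best_score)
```

Because the budget is fixed by the user, adding more hyperparameters does not multiply the number of evaluations the way grid search does.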