In this recipe, we present an alternative to Gradient Descent (GD) and LBFGS by using the Normal Equations to solve linear regression. With the normal-equation approach, you set up the regression as a matrix of features and a vector of labels (the dependent variable), and then solve for the coefficients in closed form using matrix operations such as the transpose and the inverse.
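To make the closed-form solve concrete, here is a minimal sketch in plain Python (no Spark) of the normal equations theta = (X^T X)^-1 X^T y for a tiny, hand-picked dataset; the helper functions and data are illustrative, not part of the recipe:

```python
# Illustrative sketch: solving y ~ b0 + b1*x via the normal equations.
# The sample points lie exactly on y = 2x + 1, so the recovered
# coefficients should come out (up to floating-point error) as b0=1, b1=2.

def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    # Standard row-by-column matrix product.
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def inverse_2x2(m):
    # Closed-form inverse of a 2x2 matrix (enough for one feature + intercept).
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]            # y = 2x + 1

# Design matrix X with a leading column of ones for the intercept term.
X = [[1.0, x] for x in xs]
y = [[v] for v in ys]

Xt = transpose(X)
theta = matmul(inverse_2x2(matmul(Xt, X)), matmul(Xt, y))
b0, b1 = theta[0][0], theta[1][0]
print(b0, b1)  # intercept close to 1.0, slope close to 2.0
```

In Spark itself, the same idea is exposed through MLlib's `LinearRegression` estimator, whose `setSolver("normal")` option requests the normal-equation solver instead of an iterative one.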
The emphasis here is on Spark's facility for solving linear regression with the Normal Equations, not on the details of the model or the generated coefficients.