Maximum likelihood estimation
Logistic regression works on the principle of maximum likelihood estimation. Here, we explain this principle in detail so that we can cover more fundamentals of logistic regression in the following sections. Maximum likelihood estimation is a method of estimating the parameters of a model from observed data by finding the parameter values that maximize the likelihood of those observations. In the binary case, this means finding the parameter value that maximizes the probability p assigned to each observed event (1) and the probability (1 - p) assigned to each observed non-event (0), since, as you know:
P(event) + P(non-event) = 1
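More formally, for n independent observations x_1, ..., x_n drawn from a distribution with parameter θ, the likelihood and the maximum likelihood estimate can be summarized as follows (a general sketch; the symbols θ and p(x | θ) are generic placeholders rather than notation from the example below):

```latex
% Likelihood of n independent observations under parameter theta
L(\theta) = \prod_{i=1}^{n} p(x_i \mid \theta)

% The MLE maximizes L(theta); taking the logarithm turns the
% product into a sum and does not change the maximizer
\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} L(\theta)
                            = \arg\max_{\theta} \sum_{i=1}^{n} \log p(x_i \mid \theta)
```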
Example: A sample (0, 1, 0, 0, 1, 0) is drawn from a binomial distribution. What is the maximum likelihood estimate of μ?
Solution: For the binomial distribution, P(X=1) = μ and P(X=0) = 1 - μ, where μ is the parameter. Because the observations are independent, the likelihood of the sample (two 1s and four 0s) is the product of the individual probabilities:

L(μ) = μ^2 (1 - μ)^4
Here, the log is applied to both sides of the equation for mathematical convenience; also, maximizing the likelihood is the same as maximizing the log of the likelihood:

log L(μ) = 2 log(μ) + 4 log(1 - μ)
Determining the maximum: taking the derivative of the log-likelihood with respect to μ and setting it to zero gives

d(log L(μ))/dμ = 2/μ - 4/(1 - μ) = 0

which yields 2(1 - μ) = 4μ, so μ = 2/6 ≈ 0.333. In other words, the maximum likelihood estimate of μ is simply the proportion of 1s in the sample.
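As a quick numerical check (a minimal sketch, not part of the original derivation; the helper name negative_log_likelihood is our own), the same estimate can be recovered by maximizing the log-likelihood of the sample with SciPy:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Observed sample from the example above
sample = np.array([0, 1, 0, 0, 1, 0])

def negative_log_likelihood(mu, data):
    """Negative log-likelihood of the sample for parameter mu."""
    # log L(mu) = sum_i [ x_i * log(mu) + (1 - x_i) * log(1 - mu) ]
    return -np.sum(data * np.log(mu) + (1 - data) * np.log(1 - mu))

# Maximize the log-likelihood by minimizing its negative over (0, 1)
result = minimize_scalar(negative_log_likelihood, args=(sample,),
                         bounds=(1e-6, 1 - 1e-6), method='bounded')

print("Numerical MLE:", round(result.x, 4))        # ~0.3333
print("Closed form (sample mean):", sample.mean()) # 0.3333...
```

Because the log-likelihood of this sample is concave in μ, the bounded scalar search converges to the same value as the closed-form answer, the sample mean 2/6.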