Parameter estimates
In this section, we are going to discuss some of the algorithms used for parameter estimation.
Maximum likelihood estimation
Maximum likelihood estimation (MLE) is a method of estimating a model's parameters by finding the parameter values that make the observed data most probable under the model.
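As a brief formal sketch (standard notation, not taken from the original text): given independent observations y_1, ..., y_n with density f(y | theta), the MLE is the value of theta that maximizes the likelihood, or equivalently its logarithm:

\hat{\theta} = \arg\max_{\theta} \prod_{i=1}^{n} f(y_i \mid \theta) = \arg\max_{\theta} \sum_{i=1}^{n} \log f(y_i \mid \theta)

In practice, numerical optimizers conventionally minimize the negative log-likelihood instead, which is the convention the R code in this section follows.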
Now let us try to find the parameter estimates of the probability density function of a normal distribution. First, let us generate a sample of normally distributed random values by executing the following code:
> set.seed(100)
> NO_values <- 100
> Y <- rnorm(NO_values, mean = 5, sd = 1)
> mean(Y)
This gives 5.002913.
> sd(Y)
This gives 1.02071.
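Both values are close to the true parameters (mean 5, standard deviation 1) used to simulate the sample. Before writing the estimation code, it is worth noting what the function will compute: for a normal sample, the negative log-likelihood (written here as a standard-notation sketch, not quoted from the original) is

-\ell(\mu, \sigma) = -\sum_{i=1}^{n} \log \phi(y_i \mid \mu, \sigma) = \frac{n}{2}\log(2\pi) + n\log\sigma + \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \mu)^2

and the R function below evaluates exactly this quantity by summing the log of dnorm over the sample and negating the result.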
Now let us write a function that returns the negative log-likelihood, which is the quantity that mle minimizes:
> LogL <- function(mu, sigma) {
+   A = dnorm(Y, mu, sigma)   # density of each observation
+   -sum(log(A))              # negative log-likelihood
+ }
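As a quick sanity check (an illustrative addition, not part of the original walkthrough), the function can be evaluated directly; parameter values near the sample estimates should give a smaller negative log-likelihood than the deliberately poor starting guesses used below:

> LogL(mean(Y), sd(Y))   # near the minimum
> LogL(2, 2)             # noticeably larger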
Now let us apply the mle function to estimate the mean and the standard deviation:
> library(stats4)
> mle(LogL, start = list(mu = 2, sigma = 2))
Here, mu and sigma have been given starting values of 2 in the start list; the optimizer searches from this starting point for the values that minimize the negative log-likelihood, and the resulting estimates should be close to the sample mean and standard deviation computed earlier.
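Although the passage stops here, it is often convenient to store the fitted object and inspect it with the standard stats4 accessors. The sketch below is illustrative (the name fit is our own) and assumes Y and LogL are defined as above:

> fit <- mle(LogL, start = list(mu = 2, sigma = 2))
> coef(fit)      # point estimates of mu and sigma
> vcov(fit)      # approximate variance-covariance matrix of the estimates
> logLik(fit)    # maximized log-likelihood
> summary(fit)   # estimates together with standard errors

One practical caveat: while searching, the optimizer may try negative values of sigma, in which case dnorm returns NaN and R prints warnings. A common workaround, offered here as an assumption rather than something the original text describes, is to parameterize the model by log(sigma):

> LogL2 <- function(mu, log_sigma) {
+   # same negative log-likelihood, but sigma = exp(log_sigma) is always positive
+   -sum(dnorm(Y, mu, exp(log_sigma), log = TRUE))
+ }
> fit2 <- mle(LogL2, start = list(mu = 2, log_sigma = 0))
> exp(coef(fit2)["log_sigma"])   # back-transform to recover the sd estimate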