Machine Learning (Chapter 27): Maximum Likelihood Estimate (MLE)
Introduction to Maximum Likelihood Estimation (MLE)
Maximum Likelihood Estimation (MLE) is a fundamental statistical method used in machine learning and statistics to estimate the parameters of a probability distribution or statistical model. The core idea behind MLE is to find the parameter values that maximize the likelihood function, which measures how likely it is to observe the given data under different parameter values.
Likelihood Function
Given a dataset $X = \{x_1, x_2, \ldots, x_n\}$ and a probabilistic model with a parameter $\theta$, the likelihood function is defined as the probability of observing the data given the parameter $\theta$. Mathematically, assuming the observations are independent and identically distributed, it is expressed as:

$$L(\theta) = P(X \mid \theta) = \prod_{i=1}^{n} p(x_i \mid \theta)$$

For convenience, the log-likelihood function is often used, which is the natural logarithm of the likelihood function:

$$\ell(\theta) = \log L(\theta) = \sum_{i=1}^{n} \log p(x_i \mid \theta)$$
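To make this concrete, the short sketch below evaluates both quantities for a small sample under one fixed choice of parameters. It is only an illustration: it assumes a Gaussian model via SciPy's norm distribution, reuses the data values from the implementation later in this chapter, and picks mu = 4.0 and sigma = 1.5 as arbitrary candidate values, not estimates.
python:
import numpy as np
from scipy.stats import norm

# Sample data (same values as the implementation later in this chapter)
data = np.array([2.3, 2.5, 3.1, 4.0, 4.2, 5.5, 5.7, 6.0])

# Candidate parameter values chosen arbitrarily for illustration
mu, sigma = 4.0, 1.5

# Likelihood: product of the individual densities (i.i.d. assumption)
likelihood = np.prod(norm.pdf(data, loc=mu, scale=sigma))

# Log-likelihood: sum of log-densities (numerically more stable)
log_likelihood = np.sum(norm.logpdf(data, loc=mu, scale=sigma))

print(f"Likelihood:     {likelihood:.6e}")
print(f"Log-likelihood: {log_likelihood:.4f}")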
Maximum Likelihood Estimation
The goal of MLE is to find the value of $\theta$ that maximizes the likelihood function. Since the logarithm is monotonically increasing, maximizing the log-likelihood gives the same estimate. Formally, this can be expressed as:

$$\hat{\theta}_{\text{MLE}} = \arg\max_{\theta} L(\theta) = \arg\max_{\theta} \ell(\theta)$$
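In simple models this maximization can be solved in closed form, as in the Gaussian example that follows; when it cannot, the log-likelihood can be maximized numerically. The sketch below, assuming SciPy is available, minimizes the negative log-likelihood with scipy.optimize.minimize; the starting point and the log-parameterization of sigma are illustrative choices made for this sketch.
python:
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

data = np.array([2.3, 2.5, 3.1, 4.0, 4.2, 5.5, 5.7, 6.0])

# Negative log-likelihood of a Gaussian; minimizing it maximizes the likelihood
def neg_log_likelihood(params):
    mu, log_sigma = params          # optimize log(sigma) so sigma stays positive
    sigma = np.exp(log_sigma)
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

# Starting point [mu=0, log(sigma)=0] is an arbitrary illustrative choice
result = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]))
mu_hat = result.x[0]
sigma2_hat = np.exp(result.x[1]) ** 2

print(f"Numerical MLE mean:     {mu_hat:.4f}")
print(f"Numerical MLE variance: {sigma2_hat:.4f}")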
Example: MLE for Gaussian Distribution
Let's consider a simple example where we estimate the mean and variance of a Gaussian distribution using MLE. Given a dataset $X = \{x_1, x_2, \ldots, x_n\}$ that is assumed to be drawn from a Gaussian distribution with mean $\mu$ and variance $\sigma^2$, the probability density function is:

$$f(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$$

The likelihood function for the dataset is:

$$L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x_i - \mu)^2}{2\sigma^2}\right)$$

Taking the logarithm of the likelihood function, we get the log-likelihood:

$$\ell(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log\sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$$

To find the MLE estimates $\hat{\mu}$ and $\hat{\sigma}^2$, we take the partial derivatives of the log-likelihood function with respect to $\mu$ and $\sigma^2$, and set them to zero:

$$\frac{\partial \ell}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \mu) = 0, \qquad \frac{\partial \ell}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i - \mu)^2 = 0$$

Solving these equations, we obtain:

$$\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \hat{\mu})^2$$

That is, the MLE of the mean is the sample average, and the MLE of the variance divides by $n$ rather than $n-1$ (the population variance), which is what the ddof=0 option computes in the implementation below.
Python Implementation
Below is a Python implementation of MLE for estimating the parameters of a Gaussian distribution.
python:
import numpy as np
# Sample data
data = np.array([2.3, 2.5, 3.1, 4.0, 4.2, 5.5, 5.7, 6.0])
# MLE estimation of mean
mu_hat = np.mean(data)
# MLE estimation of variance
sigma2_hat = np.var(data, ddof=0) # ddof=0 for MLE (population variance)
print(f"Estimated mean (MLE): {mu_hat}")
print(f"Estimated variance (MLE): {sigma2_hat}")
Output:
text:
Estimated mean (MLE): 4.1625
Estimated variance (MLE): 1.70859375
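As an optional cross-check (not part of the original example), SciPy's norm.fit performs maximum likelihood estimation of a Gaussian's location (mean) and scale (standard deviation), so squaring the returned scale should reproduce the ddof=0 variance computed above.
python:
import numpy as np
from scipy.stats import norm

data = np.array([2.3, 2.5, 3.1, 4.0, 4.2, 5.5, 5.7, 6.0])

# norm.fit returns maximum likelihood estimates of loc (mean) and scale (std. dev.)
loc_hat, scale_hat = norm.fit(data)

print(f"scipy MLE mean:     {loc_hat}")
print(f"scipy MLE variance: {scale_hat ** 2}")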
Conclusion
Maximum Likelihood Estimation is a powerful and widely used method for parameter estimation in statistical models. By maximizing the likelihood function, MLE provides the parameter values that best explain the observed data. The worked example shows how MLE yields closed-form estimates of the mean and variance of a Gaussian distribution, and the Python implementation demonstrates how to compute them in practice.
This chapter has covered the mathematical foundations and practical application of MLE, setting the stage for more complex models and estimation techniques in the following chapters.
