Fitting a function to data with nonlinear least squares
In this recipe, we will show an application of numerical optimization to nonlinear least squares curve fitting. The goal is to fit a function depending on several parameters to data points. Unlike in the linear least squares method, this function does not have to be linear in those parameters.
We will illustrate this method on artificial data.
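Formally, given data points $(x_i, y_i)$, nonlinear least squares looks for the parameter values that minimize the sum of squared residuals between the model and the data (written here with a generic parameter vector $\theta$; in this recipe, $\theta = (a, b, c, d)$):

$$\min_{\theta} \; \sum_{i=1}^{n} \bigl(f(x_i, \theta) - y_i\bigr)^2$$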
How to do it...
Let's import the usual libraries:
>>> import numpy as np
    import scipy.optimize as opt
    import matplotlib.pyplot as plt
    %matplotlib inline
We define a logistic function with four parameters:
>>> def f(x, a, b, c, d):
        return a / (1. + np.exp(-c * (x - d))) + b
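In mathematical form, this sigmoid is:

$$f(x) = \frac{a}{1 + \exp\bigl(-c\,(x - d)\bigr)} + b$$

where $b$ is the vertical offset, $a$ the amplitude of the step, $d$ its horizontal position, and $c$ its steepness.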
Let's define four random parameters:
>>> a, c = np.random.exponential(size=2)
    b, d = np.random.randn(2)
Now, we generate random data points by using the sigmoid function and adding a bit of noise:
>>> n = 100
    x = np.linspace(-10., 10., n)
    y_model = f(x, a, b, c, d)
    # Add Gaussian noise to the model; the .2 amplitude is an
    # arbitrary choice.
    y = y_model + .2 * np.random.randn(n)
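The fit itself can then be carried out with scipy.optimize.curve_fit(), which performs nonlinear least squares on the model function and the noisy data. The following is a minimal sketch; the initial guess p0 and the plot styling are illustrative choices, not part of the data generated above:

>>> # Estimate the four parameters from the noisy data.
    # p0 is an illustrative initial guess for (a, b, c, d).
    (a_, b_, c_, d_), _ = opt.curve_fit(f, x, y, p0=[1., 0., 1., 0.])
    y_fit = f(x, a_, b_, c_, d_)

    # Plot the noisy data points and the fitted curve.
    fig, ax = plt.subplots(figsize=(6, 4))
    ax.plot(x, y, '.', label='data')
    ax.plot(x, y_fit, '-', label='fitted model')
    ax.legend()

curve_fit() returns the estimated parameters together with their covariance matrix; the estimates can be compared with the true values a, b, c, and d used to generate the data.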