Detecting changepoints in time series
A changepoint can be defined as a point in time at which the probability distribution of a process or time series changes, for example, when the mean of the series shifts.
In this recipe, we will use the CUSUM (cumulative sum) method to detect shifts in the mean of a time series. The implementation used in the recipe has two steps (a rough code sketch follows the list):
- Finding the changepoint – an iterative process that starts by initializing a changepoint in the middle of the given time series. The CUSUM approach is then carried out based on the selected point, and the next candidate changepoint is located where the resulting CUSUM series is either maximized or minimized (depending on the direction of the changepoint we want to locate). This process continues until a stable changepoint is found or the maximum number of iterations is exceeded.
- Testing its statistical significance – a log-likelihood ratio test is used to test whether the mean of the time series changes significantly at the detected changepoint, that is, whether a model with two different means (one before and one after the changepoint) fits the data significantly better than a model with a single mean.
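To make the two steps concrete, here is a minimal NumPy/SciPy sketch of the idea. The function names (`find_changepoint_cusum`, `llr_test`), the Gaussian common-variance assumption, and the chi-squared reference distribution for the p-value are illustrative choices for this sketch, not necessarily what the recipe's implementation uses:

```python
import numpy as np
from scipy.stats import chi2


def find_changepoint_cusum(x, max_iter=10, direction="increase"):
    """Iteratively locate a single mean-shift changepoint with CUSUM.

    `tau` is treated as the last index before the shift; the starting
    guess is the middle of the series, as described above.
    """
    n = len(x)
    tau = n // 2
    for _ in range(max_iter):
        # Centre the series on the average of the two segment means
        mu = (x[: tau + 1].mean() + x[tau + 1:].mean()) / 2
        cusum = np.cumsum(x - mu)
        # For an upward shift the CUSUM bottoms out at the changepoint;
        # for a downward shift it peaks there
        new_tau = int(np.argmin(cusum)) if direction == "increase" else int(np.argmax(cusum))
        new_tau = int(np.clip(new_tau, 1, n - 2))  # keep both segments non-empty
        if new_tau == tau:  # stable changepoint found
            break
        tau = new_tau
    return tau


def llr_test(x, tau):
    """Log-likelihood ratio test: one mean vs. two means split at tau.

    Assumes Gaussian noise with a common variance and (as an
    illustrative choice) a chi-squared(2) reference for the p-value.
    """
    sigma2 = x.var(ddof=1)
    before, after = x[: tau + 1], x[tau + 1:]
    ll_h0 = -0.5 * np.sum((x - x.mean()) ** 2) / sigma2
    ll_h1 = -0.5 * (np.sum((before - before.mean()) ** 2)
                    + np.sum((after - after.mean()) ** 2)) / sigma2
    llr = 2 * (ll_h1 - ll_h0)
    return llr, chi2.sf(llr, df=2)


# Usage: a synthetic series whose mean shifts from 0 to 2 at index 100
rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
tau = find_changepoint_cusum(x)
llr, p_value = llr_test(x, tau)
print(f"changepoint after index {tau}, LLR={llr:.1f}, p-value={p_value:.4g}")
```

A small but large log-likelihood ratio (equivalently, a small p-value) indicates that splitting the series at the detected point explains the data noticeably better than a single constant mean, which is what we interpret as a significant changepoint.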