In one of his talks at QCon, John Allspaw mentioned using Holt-Winters exponential smoothing on various monitoring instances. Wikipedia has a good entry on the subject, of course, but the basic idea is to take a noisy/spiky time series and smooth it out, so that unexpected changes stand out even more. The simplest approach is a moving average: say, averaging the last 7 days of data and using that as the current day's value. More sophisticated schemes weight that average so that older data contributes less.
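As a baseline for comparison, that moving-average idea can be sketched like this (the post's own code is PHP; this is an illustrative Python version, with a trailing window and a 7-value default as assumptions):

```python
def moving_average(data, window=7):
    # Each point becomes the mean of the last `window` values,
    # including the current one; early points use what's available.
    out = []
    for i in range(len(data)):
        start = max(0, i - window + 1)
        chunk = data[start:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```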
Simple exponential smoothing effectively takes this weighted average further, with more recent values being exponentially more important than older ones. However, this has problems in the face of a long-term trend, so double exponential smoothing adds a factor for the general tendency in the data (e.g. an increasing trend over time). Triple exponential smoothing, which we're using here, also includes a factor for seasonal changes, so I thought I'd give that one a go at implementing. Each of the three smoothing aspects has its own weighting factor, alpha, beta and gamma, controlling how much impact it has, and by setting some of these to 0 we can have the same code run any of the three algorithms. Below I've broken the function out into its component parts, but you can see the whole thing on GitHub.
We'll give it a go on some web data that has an unexpected spike, and see how visible that is against the timeline. The algorithm itself is pretty simple, but we need to set up a bunch of variables first. We start off by calculating an initial trend value by looking at the difference in the average values over the first two 'seasons' (the season length being a configurable parameter of the function).
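That initial trend step might look something like this (a Python sketch of the calculation described; the real code is the PHP linked above):

```python
def initial_trend(data, season_length):
    # Average the per-step differences between the first two seasons:
    # each pair (data[L+i], data[i]) is one season apart, so dividing
    # by the season length gives an average per-step change.
    total = 0.0
    for i in range(season_length):
        total += (data[season_length + i] - data[i]) / season_length
    return total / season_length
```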
Next we create an initial value for the 'level' part, the direct data-smoothing component, map the data by season index, and calculate the seasonal components for the first period.
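A sketch of that seasonal setup (again illustrative Python rather than the post's PHP; here the initial 'level' is simply assumed to be the first data point):

```python
def initial_seasonals(data, season_length):
    # Number of complete seasons available in the data
    n_seasons = len(data) // season_length
    # Mean value of each full season
    season_averages = [
        sum(data[s * season_length:(s + 1) * season_length]) / season_length
        for s in range(n_seasons)
    ]
    # For each position within a season, average its deviation
    # from that season's mean across all complete seasons
    seasonals = []
    for i in range(season_length):
        total = sum(data[s * season_length + i] - season_averages[s]
                    for s in range(n_seasons))
        seasonals.append(total / n_seasons)
    return seasonals
```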
Finally, we actually run the smoothing. This loops over the data, updates the level, trend and season values for the three elements of the smoothing, and combines them, factoring in the weighting constants, to produce the smoothed value. By continuing beyond the end of the data, we can even use this to project into the future and make a forecast!
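Putting the pieces together, the whole thing might look like the self-contained Python sketch below. This is one common additive formulation of the algorithm described, not the post's actual PHP; the parameter names mirror the alpha/beta/gamma weighting constants:

```python
def holt_winters(data, season_length, alpha, beta, gamma, n_forecast=0):
    # Additive triple exponential smoothing: returns the smoothed
    # series plus n_forecast projected points beyond the data.
    # Initial trend: average per-step change between the first two seasons
    trend = sum((data[season_length + i] - data[i]) / season_length
                for i in range(season_length)) / season_length
    # Initial seasonal components: average deviation of each position
    # in a season from that season's mean
    n_seasons = len(data) // season_length
    season_avgs = [sum(data[s * season_length:(s + 1) * season_length])
                   / season_length for s in range(n_seasons)]
    seasonals = [sum(data[s * season_length + i] - season_avgs[s]
                     for s in range(n_seasons)) / n_seasons
                 for i in range(season_length)]
    level = data[0]
    result = [data[0]]
    for t in range(1, len(data) + n_forecast):
        idx = t % season_length
        if t >= len(data):
            # Past the end of the data: forecast from the last level/trend,
            # re-using the most recent seasonal component for this position
            m = t - len(data) + 1
            result.append(level + m * trend + seasonals[idx])
            continue
        value = data[t]
        last_level = level
        level = alpha * (value - seasonals[idx]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[idx] = gamma * (value - level) + (1 - gamma) * seasonals[idx]
        result.append(level + trend + seasonals[idx])
    return result
```

Setting `gamma` to 0 freezes the seasonal components (double exponential smoothing in effect), and setting `beta` to 0 as well freezes the trend, which is how one function can stand in for all three variants.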
John used this kind of smoothing at Etsy, in combination with error bars, to look for unusual events and trigger their monitoring systems. One thing I noticed from trying a quick implementation is that the length of time considered for the season can have a big effect on the smoothing, as can the values of the $alpha, $beta and $gamma constants, so some tweaking may be required if you use a similar technique on your own data.
If we did want to build some sort of triggering on this data, we'd need to create confidence intervals as well. We can do that with an extra array in the main Holt-Winters loop, updated like this:
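In Python terms, that deviation update might look like this. The function wrapper is just for illustration; in practice the update is a single line inside the main loop, blending the current absolute error with the deviation one season earlier:

```python
def update_deviations(data, smoothed, season_length, gamma):
    # Seasonally-weighted deviation: how far each data point strays
    # from its smoothed value, blended with the deviation recorded
    # one season ago (seeded at zero for the first season).
    dev = [0.0] * len(data)
    for t in range(len(data)):
        prev = dev[t - season_length] if t >= season_length else 0.0
        dev[t] = gamma * abs(data[t] - smoothed[t]) + (1 - gamma) * prev
    return dev
```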
This tracks how much our data is deviating from the smoothed value, factoring seasonality into that as well. We can add and subtract a number of multiples of this value to the smoothed series to create confidence bars, and signal if our data goes outside them. We'll add and subtract three multiples of the deviation score, which gives us error bars that look something like the below. Note that as the data gets more variable, the confidence bars open up to reflect the increased volatility, but when the data isn't changing much day to day the error bars stay pretty tight.
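Constructing the bars themselves is then trivial; an illustrative Python sketch, with the multiplier of three from the text as the default:

```python
def confidence_bands(smoothed, deviations, multiplier=3):
    # Upper/lower error bars: smoothed value +/- multiplier * deviation.
    # A data point outside [lower, upper] would fire the alert.
    upper = [s + multiplier * d for s, d in zip(smoothed, deviations)]
    lower = [s - multiplier * d for s, d in zip(smoothed, deviations)]
    return lower, upper
```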