U.S. Unemployment Time-Series Modelling (Part 1)

One of the many benefits of improving economic forecasts is being able to trade releases with better information through forex and stocks.  Certain sites such as Forexfactory provide a forecast parameter, and after playing around with it I found that some of these forecasts appear to just use standard ARIMA models.  In Part 1, I will show how to estimate unemployment rate log changes, and in Part 2 I will implement this through a modified BP neural network (if I can get it to work...).  I will benchmark my residuals against a standard ARIMA model with an exogenous regressor (initial claims).  The data was obtained from a Bloomberg terminal and both time series are seasonally adjusted.

 

Firstly, I will introduce our extra regressor.  U.S. unemployment initial claims act as a good leading predictor for monthly unemployment rate releases.  See the plot below for the last 30 years.

 

Green is Claims, Yellow is Rate.
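For reference, a minimal plotting sketch is below. It assumes monthly, seasonally adjusted vectors dates, claims, and rate have already been pulled from Bloomberg into MATLAB; those variable names are my own, not from the original workspace.

% Minimal sketch: overlay claims and rate on two y-axes.
% Assumes `dates`, `claims`, `rate` are monthly, seasonally adjusted vectors.
figure;
[ax, h1, h2] = plotyy(dates, claims, dates, rate);
ylabel(ax(1), 'Initial claims');
ylabel(ax(2), 'Unemployment rate (%)');
legend([h1 h2], 'Claims', 'Rate');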

 

Using a cross-correlation plot of the % changes (which gives us a stationary process), we can see which lag has the most significant correlation:

 

Lag > 0 indicates Claims precede Rate.
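As a rough sketch of how that cross-correlation check might be run in MATLAB, assuming the same claims and rate vectors as in the plotting sketch above (price2ret and crosscorr are my guesses at the functions used; the original code isn't shown):

% Convert levels to % (log) changes, then plot the sample cross-correlation.
dClaims = price2ret(claims);
dRate   = price2ret(rate);
crosscorr(dClaims, dRate, 12);   % +/- 12 monthly lags; check the toolbox docs for
                                 % which argument order makes positive lags
                                 % correspond to claims leading rate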

 

From this we can pick out lag 3 as the most significant and will use that in our model.  Now we can go ahead and specify and estimate our model.  A non-seasonal ARIMA model is a popular time-series forecasting technique used by analysts.  In functional notation it is denoted ARIMA(p, d, q), where p is the number of autoregressive lags, d is the degree of differencing, and q is the number of moving-average lags.  In general, the ARIMA model can be written as:

$$\phi(L)\,(1 - L)^{d} y_t = \theta(L)\,\varepsilon_t$$

where $L$ is the lag operator, $\phi(L)$ is the autoregressive lag polynomial, $(1 - L)^{d}$ is the differencing term, and $\theta(L)$ is the moving-average lag polynomial.

 

Since my data is represented as % changes, our time series can be assumed to be stabilized and stationary (the ADF test rejects the unit-root null), thus $d = 0$.  I now need to find the appropriate $p$ and $q$ terms.  Using the Box-Jenkins methodology, I will plot the sample ACF and PACF to identify our lagged terms:

 

Box-Jenkins Model identification
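A minimal sketch of that step, assuming dRate from the earlier sketch holds the % changes of the unemployment rate (adftest, autocorr, and parcorr are the Econometrics Toolbox functions I believe produce these plots, though the original code isn't shown):

[h, pValue] = adftest(dRate);          % h = 1 means the unit-root null is rejected
figure;
subplot(2,1,1); autocorr(dRate, 20);   % sample ACF
subplot(2,1,2); parcorr(dRate, 20);    % sample PACF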

 

Inspecting the SACF, we can see that the lagged values are mostly significant between lags 2 and 10, without sharp drop-offs, which points towards an autoregressive component.  The PACF indicates that the significant lags drop off after lag 5, so any significant lag after that in the SACF plot can be explained through time-dependence on earlier lags.  The moving-average order cannot be directly determined from these plots since there is no single sharp cut-off; however, the eventual decay to zero after a few lags signals that there is a moving-average component in the time series.  We can impose an upper bound on $q$ (determined from the last few significant lags in the SACF), create ARIMA models with moving-average lags $q = 0, \dots, 8$, and evaluate their performance with the Akaike information criterion (AIC).  Below is the MATLAB code to do that; I sampled monthly initial claims and the unemployment rate from 1984 to 2014.  Note that Y0 is presample data; I'm not 100% sure what it is other than that it is used to begin the estimation process for the parameters.  This term is just the last 3 (the AR order of the ARIMA model) returns of the unemployment rate.  X is our claims regressor in this case.

 

% Estimate ARIMAX(3,0,q) models for q = 0,...,8 and record each model's AICc
aic = [];
n   = numel(Y);                        % number of in-sample observations
for q = 0:8
    mdl = arima(3, 0, q);              % AR order 3, no differencing, MA order q
    [estMdl, estCov, logL] = estimate(mdl, Y, 'X', X, 'Y0', Y0);
    k = 3 + q + 2;                     % 3 AR + q MA coefficients, plus constant and claims regressor
    aic = [aic; aicbic(logL, k) + (2*k*(k+1))/(n - k - 1)];   % AIC plus small-sample correction (AICc)
end
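As a quick follow-up (not in the original code), the best MA order can also be read off the vector directly rather than from the plot:

[~, bestIdx] = min(aic);   % index of the smallest AICc
bestQ = bestIdx - 1;       % loop ran q = 0..8, so index 1 corresponds to q = 0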

 

Plotting the AICc for each model, we can conclude that five moving-average terms give the best-fitting model for this sample data set.  The AIC is a goodness-of-fit measure for predictive (usually regression) models.  It can be defined as

$$\mathrm{AIC} = 2k - 2\ln(\hat{L})$$

where $\hat{L}$ is the maximized likelihood of the model and $k$ is the number of estimated parameters.  The AIC trades off the likelihood of the estimated model ($-2\ln\hat{L}$) against the complexity of a model that may have high variance ($2k$), such as a neural network.  For a finite sample size we can use the corrected criterion AICc, $\mathrm{AICc} = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}$, which adjusts for the sample size and converges to the AIC as $n$ grows.  In our case we do not strictly need AICc, as we have a relatively large sample size.

Our AIC can be calculated with the aicbic function in MATLAB, where logL is our log likelihood and $k = 3 + q + 2$, the 2 covering the constant and our initial claims regressor.  Below is the AIC plot:

 

Remember that 1 on the plot means 0 moving average terms.

 

Therefore our final model is an ARIMAX model with $p = 3$, $d = 0$, $q = 5$.  Here is a fitted plot of our original data set along with a plot of the model residuals:

MSE only pertains to this data set fit.
Stationary but non-normal?
Estimated Parameters
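For completeness, here is a rough sketch of how the final fit and the residual diagnostics above might be reproduced. Y, X, and Y0 follow the earlier code; the MSE calculation and Q-Q plot are my own additions and not necessarily what generated the figures.

finalMdl = arima(3, 0, 5);                          % ARIMAX(3,0,5) from the AICc search
estMdl   = estimate(finalMdl, Y, 'X', X, 'Y0', Y0);
res      = infer(estMdl, Y, 'X', X, 'Y0', Y0);      % in-sample residuals
mse      = mean(res.^2);                            % fit MSE quoted below
figure; qqplot(res);                                % eyeball residual normality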

 

Our final MSE turns out to be 0.00054122 for the model, which isn't bad but can be improved.  The residuals look stationary but non-normally distributed, which may hint that some structure in the data can still be modelled.  In the next part I will look at modelling this series with a neural network for more robust predictions.  Hopefully the nonlinearity will capture the larger fluctuations in the data set.

Stay tuned!
