The Model

The classical regression model

The model estimated by this program is best understood as a generalization of the standard linear regression model that may be written as

 

 yt = x1,t a1 + x2,t a2 + x3,t a3 + ... + ut

 

with yt as the value of the dependent variable at time t, x1,t , x2,t , ... , xn,t as the values of the explanatory variables x1 , x2 , ... , xn at time t, a1 , a2 , ... , an as the regression coefficients for these regressors, and ut as an i.i.d. normally distributed disturbance with mean zero and variance σ2.

 

In vector notation, we write the model as

 

 yt =  xt' a + ut

 

with xt'  = ( x1,t , x2,t ,... ,xn,t ) and a' = ( a1 , a2 ,... ,an ). The regression coefficients a are taken as constants.

 

The VC regression model

The model considered here generalizes this standard model by introducing the assumption that the regression coefficients may vary over time. Hence a1,t denotes the value of the coefficient of the first variable at time t, a2,t that of the second, and so forth:

 

 yt = x1,t a1,t + x2,t a2,t + x3,t a3,t + ... + ut

 

With at' = ( a1,t , a2,t , ... , an,t ) as the vector of regression coefficients at time t, we write

 

(1)         yt = xt' at + ut

 

Regarding the coefficients a, we assume that they change slowly and unsystematically over time, in the sense that the expectation of today's coefficients at equals yesterday's state at-1 . The change of coefficient i from period t-1 to period t - its disturbance - is denoted by vi,t . It is assumed to be normally distributed with expectation zero and variance σi2 . With the vector of disturbances at time t denoted by vt = ( v1,t , v2,t , ... , vn,t )' , we write

 

(2)      at = at-1  + vt

 

The model (1), (2) generalizes the classical regression model. If the variances of the disturbances in the coefficients ( σ12,σ22,...,σn2 ) go to zero, the classical regression model is obtained as a special case.
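
For concreteness, the two equations can be simulated directly. The following Python sketch assumes two regressors and purely illustrative values for the variances; all names and numbers are hypothetical and do not refer to the program itself.

    import numpy as np

    rng = np.random.default_rng(0)
    T, n = 200, 2                       # illustrative sample size and number of regressors
    sigma = 1.0                         # std. deviation of the equation disturbance ut
    sigma_i = np.array([0.05, 0.10])    # std. deviations of the coefficient disturbances vi,t

    X = rng.normal(size=(T, n))         # explanatory variables xt'
    A = np.zeros((T, n))                # time-paths of the coefficients at
    A[0] = np.array([1.0, -0.5])        # arbitrary starting values

    for t in range(1, T):               # equation (2): at = at-1 + vt
        A[t] = A[t - 1] + rng.normal(scale=sigma_i, size=n)

    y = np.sum(X * A, axis=1) + rng.normal(scale=sigma, size=T)   # equation (1): yt = xt' at + ut

Supplying such a y and X as input, the program should in principle recover coefficient paths close to the simulated A.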

 

In order to estimate the coefficients at for the model (1), (2), the vector y = ( y1 , y2 , … , yT )' of the dependent variable and the matrix X = ( x1 , x2 , … , xT )' of the explanatory variables must be supplied as input. The program computes the matrix A = ( a1 , a2 , … , aT )', that is, the time-paths of the coefficients. It also gives their estimated standard deviations and confidence bands.

 

The variances σ2 and ( σ12 , σ22 , ... , σn2 ) are calculated by a method-of-moments estimator that coincides with the maximum-likelihood estimator for large samples and has a straightforward interpretation in small samples. The matrix Â of the estimated coefficients is obtained as the expectation of A given the data X and y:

 

 Â = E{ A | X, y }

 

The program further calculates the covariance matrix of the vector a' = vec( A )' = ( a1' , a2' , ... , aT' ), i.e. E{ (a - â) (a - â)' }. The square roots of the main diagonal elements of this matrix are the standard errors provided in the output file and used to calculate the confidence bands. If the control "calculate complete covariance matrix" is selected, the covariance matrix is written to a separate file.
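
As an illustration of the conditional expectation E{ A | X, y } and of the standard deviations behind the confidence bands, the following Python sketch computes smoothed coefficient paths for the random-walk model (1), (2) when the variances σ2 and σi2 are taken as given. It uses a standard Kalman filter with a Rauch-Tung-Striebel smoother; this is merely one way of evaluating the conditional expectation, not the program's own implementation, and the function name and arguments are made up for the example.

    import numpy as np

    def vc_smoother(y, X, sigma2, sigma_i2, prior_var=1e6):
        """Smoothed coefficient paths E{ at | X, y } and their covariances
        for given variances (illustrative Kalman filter / RTS smoother)."""
        T, n = X.shape
        Q = np.diag(sigma_i2)                                 # covariance of the vt in (2)
        a_p = np.zeros((T, n)); P_p = np.zeros((T, n, n))     # predicted states
        a_f = np.zeros((T, n)); P_f = np.zeros((T, n, n))     # filtered states
        a_prev, P_prev = np.zeros(n), prior_var * np.eye(n)   # vague prior on the initial state
        for t in range(T):
            a_p[t], P_p[t] = a_prev, P_prev + Q               # prediction step, equation (2)
            x = X[t]
            f = x @ P_p[t] @ x + sigma2                       # forecast variance of yt
            k = P_p[t] @ x / f                                # Kalman gain
            a_f[t] = a_p[t] + k * (y[t] - x @ a_p[t])         # update with observation (1)
            P_f[t] = P_p[t] - np.outer(k, x) @ P_p[t]
            a_prev, P_prev = a_f[t], P_f[t]
        a_s, P_s = a_f.copy(), P_f.copy()                     # backward (RTS) smoothing pass
        for t in range(T - 2, -1, -1):
            J = P_f[t] @ np.linalg.inv(P_p[t + 1])
            a_s[t] = a_f[t] + J @ (a_s[t + 1] - a_p[t + 1])
            P_s[t] = P_f[t] + J @ (P_s[t + 1] - P_p[t + 1]) @ J.T
        return a_s, P_s                                       # a_s[t] approximates E{ at | X, y }

    # Example call with data shaped as above and hypothetical variance values:
    # A_hat, P_hat = vc_smoother(y, X, sigma2=1.0, sigma_i2=np.array([0.05, 0.10]) ** 2)

The square roots of the diagonal elements of P_s[t] then play the role of the standard deviations discussed above, so approximate confidence bands can be formed as a_s[:, i] ± 2 * np.sqrt(P_s[:, i, i]).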

 

Descriptive interpretation

The classical regression model is estimated by ordinary least squares: The minimization of the sum of squares

 

u'u  =  Σt=1,…,T ut2

 

yields the expectations of the coefficients, i.e. â = E{ a | X, y }. The VC estimator  has a similar descriptive interpretation: It is obtained by minimizing the weighted sum of squares

 

u'u  +  Σi (σ2/σi2) vi'vi  =  Σt=1,…,T ut2  +  Σt=2,…,T Σi (σ2/σi2) vi,t2

 

In other words: the coefficients are selected such that the disturbances in the equation, ut , and the disturbances in the coefficients, vi,t , are kept as small as possible, where the changes in the coefficients are weighted with the variance ratios σ2 /σi2 . These ratios determine the decomposition and are adjusted during the iteration. The progress window depicts them, and the option "don't estimate variances" minimizes this weighted sum of squares for given variances taken from the input file.
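
The weighted sum of squares can be minimized directly once the variance ratios σ2 /σi2 are fixed, which is essentially what the option "don't estimate variances" does with the ratios taken from the input file. The following Python sketch sets the criterion up as one stacked least-squares problem; it is a plain illustration of the formula above, not the program's algorithm, and the function name and arguments are hypothetical.

    import numpy as np

    def vc_weighted_ls(y, X, ratios):
        """Minimize u'u + sum_i (sigma2/sigma_i2) vi'vi over the coefficient
        paths, with ratios[i] = sigma2/sigma_i2 taken as given."""
        T, n = X.shape
        N = T * n                                   # unknowns: the stacked coefficient vectors at
        rows, rhs = [], []
        for t in range(T):                          # fit part: yt - xt' at = ut
            r = np.zeros(N)
            r[t * n:(t + 1) * n] = X[t]
            rows.append(r); rhs.append(y[t])
        w = np.sqrt(np.asarray(ratios, dtype=float))
        for t in range(1, T):                       # smoothness part: penalize vi,t = ai,t - ai,t-1
            for i in range(n):
                r = np.zeros(N)
                r[t * n + i] = w[i]
                r[(t - 1) * n + i] = -w[i]
                rows.append(r); rhs.append(0.0)
        sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        return sol.reshape(T, n)                    # row t holds the estimated at'

As the ratios grow large, the penalty forces the changes vi,t towards zero and the solution approaches the classical constant-coefficient fit, in line with the limiting case mentioned above; small ratios leave the coefficient paths free to follow the data more closely.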

 
