# Time Series

started 13th October 2016

I built a circuit which generates pulses at random times (see Phase Noise). To test how random they were I tried autocorrelation of the times between pulses and was surprised to find the lag 1 value was always -0.5. Eventually I read in [1] ("5.5.4. The Lag 1 Autocorrelation") that this value is expected. I could not see why and I could not find a derivation online. Hence this page.

The autocorrelation at lag k is defined as:

 rho_k=(E[(z_t-mu)(z_(t+k)-mu)])/sigma_z^2

where  z_t  is the time series, μ is its mean value,  σ_z^2  is its variance, and E denotes the expected value.
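This definition is easy to check numerically. The sketch below (my own helper, not from the text) estimates rho_k from a finite sample by replacing the expectations with sample averages:

```python
import random

def autocorrelation(z, k):
    """Sample estimate of rho_k: average of (z_t - mu)(z_(t+k) - mu),
    divided by the sample variance."""
    n = len(z)
    mu = sum(z) / n
    var = sum((x - mu) ** 2 for x in z) / n
    cov = sum((z[t] - mu) * (z[t + k] - mu) for t in range(n - k)) / n
    return cov / var

random.seed(0)
z = [random.gauss(0, 1) for _ in range(10000)]
print(autocorrelation(z, 0))  # 1.0: at lag 0 the covariance is the variance
```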

For a stationary process

 E[z_t-mu]=0
 sigma_z^2=E[(z_t-mu)^2]

for all t.

By the definition of  sigma_z^2 , rho_0 equals 1.

If the  z_t  are independent random variables (white noise), then rho_k equals 0 for k!=0, because  z_t  and  z_(t+k)  are independent and the expectation factorises:

 rho_k=(E[(z_t-mu)]E[(z_(t+k)-mu)])/sigma_z^2

and  E[(z_t-mu)]  is zero. The same argument gives

 E[(z_t-mu)(z_s-mu)]=0 for t!=s
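A quick numerical check of the white-noise case (a sketch; the estimator below is mine, not from the text):

```python
import random

def autocorrelation(z, k):
    # Sample estimate of rho_k for a finite series z.
    n = len(z)
    mu = sum(z) / n
    var = sum((x - mu) ** 2 for x in z) / n
    cov = sum((z[t] - mu) * (z[t + k] - mu) for t in range(n - k)) / n
    return cov / var

random.seed(1)
z = [random.gauss(0, 1) for _ in range(100000)]
for k in (1, 2, 3):
    print(k, round(autocorrelation(z, k), 3))  # all close to 0
```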

If  z_t=r_t-r_(t-1)+mu  with the  r_t  a sequence of independent random variables with mean 0 (so  z_t  is the first difference of an independent series), then

 rho_1=(E[(r_t-r_(t-1))(r_(t+1)-r_t)])/sigma_z^2

Expanding the argument of E:

 (r_t-r_(t-1))(r_(t+1)-r_t)=r_t r_(t+1)-r_t^2-r_(t-1)r_(t+1)+r_(t-1)r_t

Every term except  -r_t^2  is a product of independent factors with mean 0, so its expectation vanishes, leaving

 rho_1=(-E[r_t^2])/sigma_z^2

The variance expands the same way:

 sigma_z^2=E[(r_t-r_(t-1))^2]=E[r_t^2]-2E[r_t]E[r_(t-1)]+E[r_(t-1)^2]=2E[r_t^2]

Hence

 rho_1=-1/2
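The result is easy to confirm by simulation. The sketch below (my own code, not from the text) differences an independent series and estimates the lag 1 autocorrelation:

```python
import random

def autocorrelation(z, k):
    # Sample estimate of rho_k for a finite series z.
    n = len(z)
    mu = sum(z) / n
    var = sum((x - mu) ** 2 for x in z) / n
    cov = sum((z[t] - mu) * (z[t + k] - mu) for t in range(n - k)) / n
    return cov / var

random.seed(2)
r = [random.uniform(0, 1) for _ in range(100000)]
# z_t = r_t - r_(t-1): first difference of an independent series
z = [r[t] - r[t - 1] for t in range(1, len(r))]
print(round(autocorrelation(z, 1), 2))  # close to -0.5
```

The distribution of  r_t  does not matter (uniform here, but any distribution with finite variance gives the same answer), since the derivation only uses independence and the mean.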

The best intuitive way of looking at this I can think of is to imagine three consecutive values  r_(t-1), r_t, r_(t+1) . The middle value appears in both differences, with opposite signs: when  r_t  is above the mean it pushes the first difference  r_t-r_(t-1)  up and the second difference  r_(t+1)-r_t  down, and vice versa, so the two differences tend to have opposite signs.

Put it another way:  z_t  and  z_(t+1)  are each the sum of two equal-variance terms. The shared pair ( r_t  in one,  -r_t  in the other) has correlation -1, while the other pairings are independent with correlation 0, so averaging gives a correlation of -1/2.

Credits

References