Hey there, welcome! 👋

Explore a bit — you’ll find cool info, handy links, and a few things about me here.

Derivation - ACF and PACF

1. AR(1) Process

The AR(1) model is: $$ y_t = a_1 y_{t-1} + \epsilon_t $$ where $\epsilon_t$ is white noise.

Autocorrelation Function (ACF) Derivation

The autocovariance at lag $k$ is: $$ \gamma_k = E[(y_t - \mu)(y_{t-k} - \mu)] $$ Multiplying the AR(1) equation by $y_{t-k}$ and taking expectations gives $\gamma_k = a_1 \gamma_{k-1}$, so the autocorrelation at lag $k$ is: $$ \rho_k = \frac{\gamma_k}{\gamma_0} = a_1^k $$ This means the ACF decays geometrically with lag $k$. If $a_1 > 0$, the decay is monotonic; if $a_1 < 0$, the ACF oscillates in sign but still decays in magnitude. ...
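As a quick numerical check of this geometric decay, here is a minimal Python sketch (not part of the post itself; the coefficient $a_1 = 0.7$, the series length, and the helper `sample_acf` are assumptions made for illustration) that simulates an AR(1) series and compares the sample ACF with $a_1^k$:

```python
import numpy as np

rng = np.random.default_rng(0)
a1 = 0.7          # assumed AR(1) coefficient for the sketch
n = 5000          # series length
eps = rng.standard_normal(n)

# Simulate y_t = a1 * y_{t-1} + eps_t
y = np.zeros(n)
for t in range(1, n):
    y[t] = a1 * y[t - 1] + eps[t]

def sample_acf(x, k):
    """Sample autocorrelation of x at lag k (k >= 1)."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

# The sample ACF should track the theoretical rho_k = a1**k
for k in range(1, 6):
    print(f"lag {k}: sample ACF = {sample_acf(y, k):.3f}  vs  a1**k = {a1**k:.3f}")
```

For a longer series the two columns agree more closely, which is the geometric-decay pattern described above.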

October 3, 2025 · Raaghav Bhardwaj

Derivation - ARMA

1. ARMA(1,1) Model Definition

The ARMA(1,1) process is given by: $$ y_t = a_1 y_{t-1} + \epsilon_t + \theta_1 \epsilon_{t-1} $$ where $\epsilon_t$ is a white noise process (mean zero, constant variance, uncorrelated over time).

2. Stationarity Condition: AR Part

The process is stationary if the mean, variance, and autocovariances do not depend on time. The key restriction comes from the autoregressive (AR) part: the characteristic equation is $1 - a_1 L = 0$, where $L$ is the lag operator, with root $L = 1/a_1$. Stationarity requires this root to lie outside the unit circle, i.e. $$ |a_1| < 1 $$ ...
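A companion sketch (again not from the post; the coefficients $a_1 = 0.5$ and $\theta_1 = 0.4$ are assumed for illustration) checks the AR-part stationarity condition and simulates an ARMA(1,1) series, comparing its sample variance with the standard closed-form value $\gamma_0 = \sigma^2(1 + \theta_1^2 + 2 a_1 \theta_1)/(1 - a_1^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)
a1, theta1 = 0.5, 0.4    # assumed coefficients for the sketch
n = 20000

# Stationarity check: root of 1 - a1*L = 0 must lie outside the unit circle
print(f"AR root 1/a1 = {1/a1:.2f}, stationary: {abs(a1) < 1}")

# Simulate y_t = a1*y_{t-1} + eps_t + theta1*eps_{t-1}
eps = rng.standard_normal(n + 1)
y = np.zeros(n + 1)
for t in range(1, n + 1):
    y[t] = a1 * y[t - 1] + eps[t] + theta1 * eps[t - 1]

# Standard ARMA(1,1) variance with unit noise variance
gamma0 = (1 + theta1**2 + 2 * a1 * theta1) / (1 - a1**2)
print(f"sample mean     = {y[1:].mean():.3f} (theory: 0)")
print(f"sample variance = {y[1:].var():.3f} (theory: {gamma0:.3f})")
```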

October 3, 2025 · Raaghav Bhardwaj

Autoregression

Introduction to Autoregression

Let’s talk about autoregression in a way that’s as simple as possible, using examples you might see in India! Imagine you are learning to guess what comes next in a pattern. For example, if you see the numbers 2, 4, 6, 8… what do you think comes next? (It’s 10!) Autoregression is a smart way of guessing the next value by looking at what happened before. It’s like checking the weather for the last few days: if it rained for three days, you might guess it will rain again tomorrow! ...
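To make the "guess the next value from the past" idea concrete, here is a tiny Python sketch (not from the post; the toy temperature series and the no-frills least-squares fit are assumptions for illustration) that estimates a single AR(1)-style coefficient and predicts one step ahead:

```python
import numpy as np

# Made-up daily temperatures, just to illustrate the idea
y = np.array([23.0, 25.0, 24.0, 26.0, 27.0, 26.5, 28.0])

# Fit y_t - mean = a1 * (y_{t-1} - mean) by least squares
m = y.mean()
z = y - m
a1 = np.dot(z[:-1], z[1:]) / np.dot(z[:-1], z[:-1])

# One-step-ahead guess for "tomorrow"
forecast = m + a1 * (y[-1] - m)
print(f"estimated a1 = {a1:.3f}, guess for tomorrow = {forecast:.2f}")
```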

October 1, 2025 · Raaghav Bhardwaj