Autoregressive Models – Part II (Introducing the ARIMA and SARIMA models)
- dadoaentender
- Nov 5, 2025
- 4 min read

Introduction
Up to now, I have presented the ETS model as a simple alternative for making forecasts based on time series. If the time series exhibits regular trend or seasonality patterns, ETS may be a suitable choice, as it models these components explicitly (to review the components of a time series, click here).
For time series that do not exhibit such regular trend and seasonality patterns, ARIMA (Autoregressive Integrated Moving Average) and SARIMA (Seasonal Autoregressive Integrated Moving Average) models may be more suitable alternatives. But what exactly are the differences between ETS, ARIMA, and SARIMA models?
Differences between ETS and ARIMA models
ETS (Error, Trend, and Seasonality) models are based on time series decomposition and explicitly model trend and seasonality components in an additive or multiplicative way. They are ideal for series with clear and predefined patterns. ARIMA models, on the other hand, focus on modeling temporal dependencies by transforming the series into a stationary one and exploiting autoregressive and moving average patterns, without assuming a fixed seasonal structure.
Note: A stationary series is one whose statistical properties (mean, variance, autocorrelation) remain constant over time. This means that it does not exhibit marked trends or seasonality, and that fluctuations around the mean are consistent and predictable. In practical terms, a stationary series facilitates modeling and forecasting in time series analysis, as it is assumed that patterns observed in the past will continue to be valid in the future. If the series is not stationary, transformations can be applied to make it stationary.
ARIMA models work by performing the following steps:
Transforming a Time Series into a Stationary Series: ARIMA models assume that the time series is stationary; however, many real-world time series are not naturally stationary. To address this, the "Integrated" component (the letter I in ARIMA) applies differencing to the series (subtracting from each value the value that precedes it). Differencing can be applied one or more times to remove trends and make the data stationary before modeling.
Exploring Autoregressive Patterns: The AR (AutoRegressive) part of the model represents the relationship between past values in the series and the current value. This means that the model attempts to predict a future value based on linear combinations of past values. This relationship is captured through autocorrelation, that is, the influence that past values exert on future values.
Use of Moving Averages: The MA (Moving Average) part of the model represents the relationship between past forecast errors and the current value. In other words, it models the patterns of errors made when predicting past values to better adjust future forecasts. The model uses a linear combination of these errors to reduce the variability of the series.
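The differencing step described in the "Integrated" item above can be illustrated with a minimal example (Python/numpy, an assumption on my part, not code from the article's notebook):

```python
import numpy as np

# Series with a linear trend: 3, 5, 7, 9, ... (slope 2)
series = np.array([3, 5, 7, 9, 11, 13], dtype=float)

# First difference: each value minus its predecessor (the "I" in ARIMA)
diff1 = np.diff(series)
print(diff1)  # [2. 2. 2. 2. 2.] -- the trend is gone, the series is constant

# A quadratic trend requires two rounds of differencing (d = 2)
quad = np.array([1, 4, 9, 16, 25, 36], dtype=float)
diff2 = np.diff(quad, n=2)
print(diff2)  # [2. 2. 2. 2.]
```

One round of differencing removes a linear trend; each additional round removes one more polynomial degree, which is exactly what the parameter d counts.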
We typically represent a fitted ARIMA model as ARIMA(p,d,q), where p, d, and q indicate, respectively, how many previous values are considered by the AR component; how many times the series was differenced by the I component; and how many past errors are considered by the MA component. The parameters p, d, and q of the ARIMA(p,d,q) model can be determined manually; however, there are methods to automate this selection, the best known being the auto.arima() function from the forecast package (R language).
Unlike ETS models, standard ARIMA models do not explicitly include a seasonal component. They are more flexible in adjusting to short-term patterns and can capture implicit seasonality by increasing the model order, but they lack specific seasonal terms that address regular cycles. If the series has a clear seasonal pattern, a SARIMA (Seasonal ARIMA) model may be more suitable, as it adds terms that explicitly capture recurring seasonality. Which leads us to the next item...
Differences between ARIMA and SARIMA
While ARIMA models deal with stationary (or stationary-transformed) time series, SARIMA models extend the approach by incorporating explicit seasonal components. They introduce additional parameters to capture repetitive patterns at regular intervals, such as monthly or annual seasonality, making them more suitable for series with well-defined periodic fluctuations.
SARIMA models are represented as SARIMA(p,d,q)(P,D,Q)m, where the terms (P,D,Q) represent the seasonal equivalents of (p,d,q), and m indicates the periodicity of the seasonality (e.g., m = 12 for monthly data with annual seasonality). Like ARIMA, a SARIMA model can be specified manually or fitted automatically using methods such as auto.arima(), which can identify both the non-seasonal and the seasonal components of the model. This approach allows seasonal patterns to be captured more efficiently, making SARIMA a powerful option for time series exhibiting well-defined repeating cycles.
Practical example
Would you like to see a practical, commented example with real-world data on the application of ARIMA and SARIMA models? Access this example notebook, developed in Google Colab (click here to see a previous article where I introduced Google Colab).
Note that, in the example presented (prediction of gasoline sales in the state of São Paulo), the SARIMA model performed considerably better than the ARIMA model (SMAPE of 3.81% and 10.58%, respectively). This indicates that, for this time series, there is an advantage in modeling the seasonal component explicitly.
Note: The SMAPE (Symmetric Mean Absolute Percentage Error) metric, used to evaluate the models above, is a variation of the mean absolute percentage error (MAPE) that addresses its asymmetry between forecasts above and below the actual value. SMAPE is calculated according to the formula below:

SMAPE = (100% / n) × Σₜ |Fₜ − Aₜ| / ((|Aₜ| + |Fₜ|) / 2)

where Aₜ is the actual value, Fₜ is the forecast, and n is the number of observations.
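SMAPE takes only a few lines to compute (a Python/numpy sketch; the function name smape is my own, not from the article's notebook):

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    # Denominator averages actual and forecast magnitudes,
    # which bounds the metric between 0% and 200%
    denom = (np.abs(actual) + np.abs(forecast)) / 2.0
    return 100.0 * np.mean(np.abs(forecast - actual) / denom)

actual = np.array([100.0, 200.0, 300.0])
over = np.array([110.0, 220.0, 330.0])   # 10% over-prediction everywhere
under = np.array([90.0, 180.0, 270.0])   # 10% under-prediction everywhere

print(f"{smape(actual, over):.2f}%")   # 9.52%
print(f"{smape(actual, under):.2f}%")  # 10.53%
```

Because the denominator mixes actual and forecast values, the metric stays well-defined even when some actuals are near zero, a case where plain MAPE blows up.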
References
Hyndman, R. J. and Athanasopoulos, G. (2018). Forecasting: Principles and Practice. OTexts.
Hyndman, R. J. and Khandakar, Y. (2008). Automatic time series forecasting: The forecast package for R. Journal of Statistical Software, 27(3):1–22.
Dritsaki, C., Niklis, D., and Stamatiou, P. (2021). Oil consumption forecasting using ARIMA models: An empirical study for Greece. International Journal of Energy Economics and Policy, 11(4):214–224.
Nielsen, A. (2019). Practical Time Series Analysis: Prediction with Statistics and Machine Learning. O'Reilly Media, Inc.


