21210457 - Statistical methods for econometrics and finance

The course aims to introduce the main techniques of econometrics, whose use has become common practice in empirical work across many areas of economic, financial and business analysis. The focus is on the intuition behind the different approaches and on their practical relevance. The course introduces and discusses empirical examples and applications from areas such as labour economics, finance, international economics, environmental economics, macroeconomics and management. The use of the different procedures is illustrated through practical examples based on data taken from real cases, using suitable software (EViews, R).

Curriculum


Programme

The course introduces the main tools of statistical modeling applied to economics and finance. Following brief reviews of matrix algebra and estimation theory, the classical linear regression model is presented, with particular emphasis on the model assumptions, estimation by ordinary least squares (OLS), and the Gauss–Markov theorem.
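For concreteness, the following is a minimal R sketch of this first step (R being one of the software tools mentioned in the course description). The data frame df and the variables y, x1 and x2 are simulated for illustration only and are not part of the course material; the block simply places OLS via lm() next to the matrix formula for the estimator discussed in the Gauss–Markov setting.

```r
# Hypothetical simulated data: y regressed on x1 and x2
set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- rnorm(n)
y  <- 1 + 0.5 * x1 - 0.3 * x2 + rnorm(n)   # "true" coefficients used only for simulation
df <- data.frame(y, x1, x2)

# OLS via lm(): coefficient estimates, t tests and the F test appear in summary()
fit <- lm(y ~ x1 + x2, data = df)
summary(fit)
confint(fit)                     # confidence intervals for the parameters

# The same estimator in matrix form: beta_hat = (X'X)^(-1) X'y
X <- model.matrix(fit)
beta_hat <- solve(t(X) %*% X, t(X) %*% df$y)
beta_hat                         # matches coef(fit)

# Variance-covariance matrix of the OLS estimator under the classical assumptions
sigma2_hat <- sum(residuals(fit)^2) / (n - ncol(X))
sigma2_hat * solve(t(X) %*% X)   # compare with vcov(fit)
```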
Maximum likelihood estimation, statistical tests on parameters, and methods for addressing violations of classical regression assumptions are then discussed, including heteroskedasticity, autocorrelation, multicollinearity, and measurement error. Generalized least squares, diagnostic tests for detecting structural violations, and corrective techniques—including instrumental variables estimators—are introduced.
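A possible illustration of these diagnostic and corrective steps in R is sketched below. The lmtest and sandwich packages are one workable toolset (the course may equally use EViews), and the simulated data with deliberately heteroskedastic errors are an assumption made here purely for the example.

```r
library(lmtest)     # diagnostic tests (Breusch-Pagan, Durbin-Watson, Breusch-Godfrey, RESET)
library(sandwich)   # robust covariance estimators

# Hypothetical data with heteroskedastic errors, for illustration only
set.seed(2)
n  <- 200
x1 <- rnorm(n); x2 <- rnorm(n)
y  <- 1 + 0.5 * x1 - 0.3 * x2 + rnorm(n, sd = exp(0.5 * x1))
df <- data.frame(y, x1, x2)
fit <- lm(y ~ x1 + x2, data = df)

bptest(fit)                # Breusch-Pagan test for heteroskedasticity
dwtest(fit)                # Durbin-Watson test for first-order autocorrelation
bgtest(fit, order = 2)     # Breusch-Godfrey test for higher-order autocorrelation
resettest(fit)             # Ramsey RESET specification test

# Heteroskedasticity-robust (White) standard errors as one possible correction
coeftest(fit, vcov = vcovHC(fit, type = "HC1"))

# Instrumental variables / 2SLS with a hypothetical instrument z for x1
# (z is not in the simulated data, so this is shown only as a template)
# library(AER)
# iv_fit <- ivreg(y ~ x1 + x2 | z + x2, data = df)
# summary(iv_fit, diagnostics = TRUE)
```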
The course then focuses on linear forecasting, model misspecification, and the use of dummy variables for analyzing structural stability. Measures of goodness of fit and criteria for model comparison (e.g. R², AIC, BIC) are examined.
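The short R sketch below illustrates, on hypothetical simulated data, how these ideas fit together: an out-of-sample forecast with its error measures, the usual goodness-of-fit statistics and information criteria, and a dummy-variable regression whose F test is equivalent to the Chow test for a structural break at a (here arbitrarily chosen) known date.

```r
# Hypothetical illustration of forecasting and model comparison
set.seed(3)
n  <- 120
x  <- rnorm(n)
y  <- 2 + 0.8 * x + rnorm(n)
df <- data.frame(y, x)

train <- df[1:100, ]                 # estimation sample
test  <- df[101:120, ]               # hold-out sample for forecasting

fit  <- lm(y ~ x, data = train)
pred <- predict(fit, newdata = test)

err <- test$y - pred
c(MSE = mean(err^2), RMSE = sqrt(mean(err^2)), MAE = mean(abs(err)))

# Goodness of fit and information criteria for in-sample comparison
summary(fit)$r.squared
summary(fit)$adj.r.squared
AIC(fit); BIC(fit)

# A dummy interacted with the regressor gives an F test equivalent to the
# Chow test for a break at a hypothetical known date (observation 60)
df$d  <- as.numeric(seq_len(n) > 60)
unres <- lm(y ~ x * d, data = df)    # intercept and slope allowed to change
res   <- lm(y ~ x, data = df)
anova(res, unres)                    # F test of structural stability
```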
The second part of the course introduces the temporal dimension of observations and deals with distributed lag models (DLMs), panel data models, and the ARIMA class of time series models. For DLMs, estimation based on the Almon polynomial is presented. For panel data, fixed effects and random effects models are discussed. The time series component of the course covers descriptive analysis, structural components (trend, cycle, seasonality), and stochastic models (AR, MA, ARIMA).
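Before the detailed topic list, the time series side can be previewed with a minimal base-R sketch: a simulated AR(1) series (an assumption made only for the example), its correlogram and partial correlogram, first differencing, and a Box–Jenkins style ARIMA fit. The commented unit-root tests refer to the tseries package, one possible choice for that step.

```r
# Hypothetical AR(1) series used only to illustrate the time series tools
set.seed(4)
y <- arima.sim(model = list(ar = 0.7), n = 300)

acf(y)            # autocorrelation function (correlogram)
pacf(y)           # partial autocorrelation function
dy <- diff(y)     # first differencing, as used for trend removal

# Unit root / stationarity tests (tseries is one package providing them)
# library(tseries)
# adf.test(y)     # Augmented Dickey-Fuller test
# kpss.test(y)    # KPSS test

# Box-Jenkins style fit of an AR(1) model with base R
fit <- arima(y, order = c(1, 0, 0))
fit
AIC(fit)          # information criterion for order selection
```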
1. Preliminary theoretical review: Matrix algebra: operations, rank, determinant, inverse, eigenvalues and eigenvectors, vec operator, Kronecker product. Estimation theory: unbiasedness, consistency, efficiency, method of moments. Multivariate probability and key distributions (multivariate normal).
2. Classical linear regression: Linear model formulation; assumptions for errors and regressors; OLS estimation; properties of OLS estimators; Gauss–Markov theorem; variance–covariance matrix of the estimator; confidence intervals; t and F tests.
3. Alternative estimation and model verification: Maximum likelihood estimation; Fisher information matrix. Multicollinearity: diagnosis, consequences for parameter estimates, possible solutions. Heteroskedasticity: diagnosis (Breusch–Pagan, White tests), consequences, corrections and GLS estimators. Error autocorrelation: diagnosis (Durbin–Watson, Breusch–Godfrey tests), consequences, possible solutions. Measurement error; specification tests (Ramsey RESET). Instrumental variables and IV/2SLS estimators.
4. Variable selection and structural stability: Consequences of excluding relevant variables and including redundant variables. Model selection criteria; dimension reduction. Automated procedures: forward selection, backward elimination, stepwise selection. Model uncertainty and overfitting. Structural stability of the regression function: graphical analysis, Chow test; inclusion of dummy variables and interpretation.
5. Regression forecasting and diagnostics: Linear forecasting and properties of forecast errors. Forecast error measures: MSE, RMSE, MAE. Goodness-of-fit measures: R², adjusted R². Model comparison criteria: AIC, BIC.
6. Models for temporal observations: Distributed lag models: structure of variables; dynamic nature of parameters; polynomial approximation and Almon polynomials. Panel data models: structure of panel data; fixed effects and random effects models; Within and GLS estimators; Hausman test; autocorrelation and heteroskedasticity in panel data; introduction to dynamic panel models. Time series analysis: structural components (trend, cycle, seasonality); moving averages and differencing; autocovariance and autocorrelation; autocorrelation function (ACF); partial autocorrelation function (PACF); correlogram; tests for stationarity (unit roots, Dickey–Fuller test, ADF test, KPSS test). Stochastic models: AR, MA, ARMA; stationarity and invertibility conditions; model identification (Box–Jenkins).
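To complement point 6, the following is a minimal panel-data sketch in R under stated assumptions: a simulated balanced panel of 50 units over 10 periods, with the plm package used as one possible tool for the within (fixed effects) and GLS (random effects) estimators and the Hausman test. The data frame dat and its variables are illustrative, not course material.

```r
library(plm)   # one possible package for panel data estimators

# Hypothetical balanced panel: 50 units observed over 10 periods
set.seed(5)
N <- 50; TT <- 10
dat <- data.frame(
  id   = rep(1:N, each = TT),
  time = rep(1:TT, times = N),
  x    = rnorm(N * TT)
)
alpha <- rep(rnorm(N), each = TT)                 # unit-specific effects
dat$y <- 1 + 0.5 * dat$x + alpha + rnorm(N * TT)

fe <- plm(y ~ x, data = dat, index = c("id", "time"), model = "within")  # fixed effects (Within)
re <- plm(y ~ x, data = dat, index = c("id", "time"), model = "random")  # random effects (GLS)
summary(fe)

phtest(fe, re)   # Hausman test: fixed effects vs random effects
```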


Core Documentation

- Stock, J. H., & Watson, M. W. (2015). Introduzione all’econometria (3rd ed.). Pearson, Milan.
- Verbeek, M. (2021). Econometria (6th ed.). Zanichelli, Bologna.
- Wooldridge, J. M. (2019). Introductory Econometrics: A Modern Approach (7th ed.). Cengage Learning.
- Lecturer’s notes.




Attendance

Classroom lessons according to the schedule set by the School of Economics and Business Studies

Type of evaluation

Oral exam on the course topics
