
Exponential Autoregressive Parameters Estimation

Authors: Goryainov V.B., Khing W.M. Published: 08.10.2019
Published in issue: #5(86)/2019  
DOI: 10.18698/1812-3368-2019-5-4-18

 
Category: Mathematics | Chapter: Computational Mathematics  
Keywords: exponential autoregression, least squares estimate, least absolute deviation estimate

The purpose of the research was to compare the least squares estimate and the least absolute deviation estimate depending on the probability distribution of the renewal process of the autoregressive equation. To achieve this goal, a sequence of observations of the exponential autoregressive process was repeatedly reproduced by computer simulation, and the least squares estimate and the least absolute deviation estimate were calculated for each sequence. The resulting sequences of estimates were used to calculate the sample variances of the least squares estimate and the least absolute deviation estimate; the better estimate was the one with the lower sample variance. The quantitative measure for comparing the estimates was their sample relative efficiency, defined as the inverse ratio of their sample variances. As models of the probability distribution of the renewal process we used the normal distribution; the contaminated normal (Tukey) distribution with different values of the proportion and intensity of contamination; the logistic distribution; the Laplace distribution; and the Student distribution with different numbers of degrees of freedom, in particular with one degree of freedom, i.e., the Cauchy distribution. For each probability distribution, asymptotic values of the sample relative efficiency were obtained as the sample size of observations of the autoregressive process increases without bound. The findings show that the least absolute deviation estimate is better than the least squares estimate for the Laplace distribution and for the contaminated normal distribution with a sufficiently large proportion and intensity of contamination. In other cases, the least squares estimate is preferable.
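The simulation scheme described in the abstract can be sketched as follows. This is a minimal illustrative Python/NumPy/SciPy sketch, not the authors' code: the EXPAR(1) model form X_t = (a + b·exp(−c·X²_{t−1}))·X_{t−1} + ε_t, the parameter names a, b, c, the numeric settings, and the simplification of treating the nonlinearity parameter c as known are all assumptions made here for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative EXPAR(1) model (a common form of exponential autoregression):
#   X_t = (a + b * exp(-c * X_{t-1}^2)) * X_{t-1} + eps_t
# Parameter names and all numeric settings are assumptions for this sketch.

def simulate_expar(n, a, b, c, noise, rng):
    """Generate n observations of the EXPAR(1) process with given innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = (a + b * np.exp(-c * x[t - 1] ** 2)) * x[t - 1] + noise(rng)
    return x

def residuals(theta, x, c):
    # For simplicity c is treated as known here; the paper estimates
    # the full parameter vector.
    a, b = theta
    pred = (a + b * np.exp(-c * x[:-1] ** 2)) * x[:-1]
    return x[1:] - pred

def estimate(x, loss, c):
    """Fit (a, b) by minimizing sum(loss(residuals)).

    loss = np.square gives the least squares estimate;
    loss = np.abs gives the least absolute deviation estimate.
    """
    obj = lambda th: loss(residuals(th, x, c)).sum()
    return minimize(obj, x0=[0.0, 0.0], method="Nelder-Mead").x

rng = np.random.default_rng(0)
a0, b0, c0 = 0.5, 0.9, 1.0            # true parameters (|a0| < 1)
laplace = lambda r: r.laplace(0.0, 1.0)

ls_a, lad_a = [], []
for _ in range(100):                   # Monte Carlo replications
    x = simulate_expar(300, a0, b0, c0, laplace, rng)
    ls_a.append(estimate(x, np.square, c0)[0])
    lad_a.append(estimate(x, np.abs, c0)[0])

# Sample relative efficiency of LAD with respect to LS for parameter a:
# values above 1 mean LAD has the smaller sample variance.
eff = np.var(ls_a) / np.var(lad_a)
print(f"relative efficiency (LAD vs LS) for a: {eff:.2f}")
```

With Laplace innovations, as the abstract reports, the ratio typically comes out above 1, i.e. the least absolute deviation estimate has the smaller sample variance; replacing `laplace` with a normal generator reverses the ordering.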

References

[1] Brockwell P.J., Davis R.A. Introduction to time series and forecasting. Springer Texts in Statistics. Cham, Springer, 2016. DOI: https://doi.org/10.1007/978-3-319-29854-2

[2] De Gooijer J.G. Elements of nonlinear time series analysis and forecasting. Springer Texts in Statistics. Cham, Springer, 2017. DOI: https://doi.org/10.1007/978-3-319-43252-6

[3] Ozaki T. Time series modeling of neuroscience data. CRC Press, 2012.

[4] Merzougui M. Estimation in periodic restricted EXPAR(1) models. Comm. Statist. Simulation Comput., 2018, vol. 47, iss. 10, pp. 2819--2828. DOI: https://doi.org/10.1080/03610918.2017.1361975

[5] Merzougui M., Dridi H., Chadli A. Test for periodicity in restrictive EXPAR models. Comm. Statist. Theory Methods, 2016, vol. 45, iss. 9, pp. 2770--2783. DOI: https://doi.org/10.1080/03610926.2014.887110

[6] Olugbode M., El-Masry A., Pointon J. Exchange rate and interest rate exposure of UK industries using first-order autoregressive exponential GARCH-in-mean (EGARCH-M) approach. The Manchester School, 2014, vol. 82, iss. 4, pp. 409--464. DOI: https://doi.org/10.1111/manc.12029

[7] Ghosh H., Gurung B., Gupta P. Fitting EXPAR models through the extended Kalman filter. Sankhya B, 2015, vol. 77, iss. 1, pp. 27--44. DOI: https://doi.org/10.1007/s13571-014-0085-8

[8] Gurung B. An exponential autoregressive (EXPAR) model for the forecasting of all India annual rainfall. Mausam, 2015, vol. 66, no. 4, pp. 847--849.

[9] Goryainov V.B., Goryainova E.R. The influence of anomalous observations on the least squares estimate of the parameter of the autoregressive equation with random coefficient. Herald of the Bauman Moscow State Technical University, Series Natural Sciences, 2016, no. 2, pp. 16--24 (in Russ.). DOI: https://doi.org/10.18698/1812-3368-2016-2-16-24

[10] Goryainova E.R., Botvinkin E.A. Experimental and analytic comparison of the accuracy of different estimates of parameters in a linear regression model. Autom. Remote Control, 2017, vol. 78, iss. 10, pp. 1819--1836. DOI: https://doi.org/10.1134/S000511791710006X

[11] Lehmann E.L., Casella G. Theory of point estimation. Springer Texts in Statistics. New York, NY, Springer, 1998. DOI: https://doi.org/10.1007/b98854

[12] Huber P., Ronchetti E.M. Robust statistics. Wiley, 2009.

[13] Mudrov V.I., Kushko V.L. Metod naimen’shikh moduley [Least absolute deviations method]. Moscow, Znanie Publ., 1971. (in Russ.).

[14] Rhinehart R.R. Nonlinear regression modeling for engineering applications: modeling, model validation, and enabling design of experiments. Wiley, 2016.

[15] Chan K.S., Tong H. On the use of deterministic Lyapunov function for the ergodicity of stochastic difference equations. Adv. Appl. Probab., 1985, vol. 17, iss. 3, pp. 666--678. DOI: https://doi.org/10.2307/1427125

[16] White H. Asymptotic theory for econometricians. Academic Press, 2001.

[17] Goryainov V.B., Goryainova E.R. Robust estimation in threshold autoregression. Herald of the Bauman Moscow State Technical University, Series Natural Sciences, 2017, no. 6, pp. 19--30 (in Russ.). DOI: https://doi.org/10.18698/1812-3368-2017-6-19-30

[18] Shiryaev A.N. Veroyatnost’ [Probability]. Moscow, MTsMNO Publ., 2011 (in Russ.).

[19] Billingsley P. Convergence of probability measures. Wiley, 1999.

[20] Magnus J.R., Neudecker H. Matrix differential calculus with applications in statistics and econometrics. Wiley, 2007.