The inputs are linearly scaled, as in the previously described methods. In this respect, Gaussian processes can serve as a nonparametric regression method: an a priori distribution is assumed over functions of the input variables provided during training, and the inputs are then combined through a measure of similarity between points (the kernel function) to predict the future value of the variable of interest.
The kernel function used is the radial basis function (RBF), while the initial noise variance and the termination tolerance were set to 0. The GP method is constructed by exploiting the gausspr function of the kernlab R statistical package [58]. Recurrent neural networks (RNNs), in contrast, introduce feedback into the network: a copy of the previous values of the layer containing the recurrent nodes is saved and used as an additional input for the next step. In this respect, the network is allowed to exhibit dynamic temporal behavior over a time sequence.
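The kernel-based prediction mechanism just described can be sketched in a few lines of base R. This is illustrative only, not the kernlab implementation used in the study; the length-scale and noise variance below are assumed values.

```r
# Minimal Gaussian process regression with an RBF kernel (illustration only).
rbf <- function(a, b, ell = 1) exp(-0.5 * outer(a, b, "-")^2 / ell^2)

x  <- seq(0, 2 * pi, length.out = 20)  # training inputs (already scaled)
y  <- sin(x)                           # training targets
xs <- c(1, 2, 3)                       # points at which to predict

sigma2 <- 1e-4                              # assumed noise variance
K  <- rbf(x, x) + sigma2 * diag(length(x))  # kernel (similarity) matrix
Ks <- rbf(xs, x)                            # similarities to training points
mu <- Ks %*% solve(K, y)                    # GP predictive mean at xs
```

In kernlab, a roughly equivalent fit is obtained with gausspr(x, y) followed by predict() on the new points.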
- Statistical and Machine Learning forecasting methods: Concerns and ways forward
In this study, the model used to implement the RNN is the sequential one. It comprises two layers: a hidden layer containing recurrent nodes and an output layer containing one or more linear nodes.
Due to high computational requirements, we did not use k-fold validation to choose an optimal network architecture per series; instead, three input nodes and six recurrent units forming the hidden layer were used for all the time series of the dataset. The selection was based on the results for a random sample of series, for which this parameterization displayed the best performance. Regarding the rest of the hyper-parameters, a fixed number of epochs was chosen and the learning rate was set to 0.
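A single forward pass of such a network can be sketched in base R. The weights are random placeholders rather than trained values, and the tanh activation is assumed purely for illustration:

```r
# One forward pass of an Elman-style RNN: three input nodes, six recurrent
# units whose previous state is fed back at the next step, one linear output.
set.seed(1)
n_in <- 3; n_hid <- 6
W_in  <- matrix(rnorm(n_hid * n_in),  n_hid, n_in)   # input -> hidden weights
W_rec <- matrix(rnorm(n_hid * n_hid), n_hid, n_hid)  # recurrent weights
w_out <- rnorm(n_hid)                                # hidden -> output weights

rnn_forecast <- function(series) {
  h <- rep(0, n_hid)                      # initial hidden state
  for (t in seq(n_in, length(series))) {
    x <- series[(t - n_in + 1):t]         # window of three lagged inputs
    h <- tanh(W_in %*% x + W_rec %*% h)   # previous state reused as input
  }
  sum(w_out * h)                          # single linear output node
}
```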
Similarly, due to high computational cost, the architecture of the model consists of three input nodes, six LSTM units forming the hidden layer, and a single linear node in the output layer.
The linear activation function is used before the output of all units, and the hard sigmoid one for the recurrent step. Regarding the rest of the hyper-parameters, the rmsprop optimizer was used, a fixed number of epochs was chosen, and the learning rate was set to 0.

Some studies suggest that ML methods can handle the raw data directly; other studies, however, have concluded the opposite, claiming that without appropriate preprocessing ML methods may become unstable and yield suboptimal results [28]. Preprocessing can be applied in three forms: seasonal adjustment, log or power transformations, and removal of the trend.
For instance, [11] found that MLP cannot capture seasonality adequately, while [63] claims exactly the opposite. More empirical results are therefore needed to support the conclusions related to preprocessing, including the most appropriate way to eliminate the trend in the data [65]. In this study, the following indicative alternatives are used:

- No preprocessing is applied.
- The log or the Box-Cox [66] power transformation is applied to the original data in order to achieve stationarity in the variance.
- The data are considered seasonal if a significant autocorrelation coefficient exists at lag 12. In that case, the data are deseasonalized using the classical multiplicative decomposition approach [39]; the training of the ML weights, or the optimization of the statistical methods, is subsequently done on the seasonally adjusted data, and the forecasts obtained are reseasonalized to determine the final predictions. This is not done for the ETS and ARIMA methods, since they include seasonal models selected using appropriate tests and information criteria that handle seasonality and model complexity directly.
- A Cox-Stuart test [67] is performed to establish whether a deterministic linear trend should be used or, alternatively, first differencing applied, to eliminate the trend from the data and achieve stationarity in the mean.
- A combination of the above three (transformation, seasonal adjustment, and detrending).

Other estimation methods, including the innovations algorithm, are provided by itsmr.
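The preprocessing alternatives described above can be sketched with base-R tools: a log transformation standing in for the Box-Cox family, classical multiplicative decomposition for deseasonalization, and first differencing for detrending. AirPassengers is used purely as an example series:

```r
x <- AirPassengers                     # monthly example series

x_log  <- log(x)                       # stabilizes the variance
seas   <- decompose(x, type = "multiplicative")$seasonal
x_adj  <- x / seas                     # seasonally adjusted series
x_diff <- diff(x)                      # first differencing removes the trend
# Forecasts produced from x_adj are reseasonalized by multiplying back the
# seasonal indices for the forecast horizon.
```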
The mar1s package handles multiplicative AR(1) with seasonal processes. TSTutorial provides an interactive tutorial for Box-Jenkins modelling.
Transfer function models are provided by the arimax function in the TSA package, and the arfima function in the arfima package. Outlier detection following the Chen-Liu approach is provided by tsoutliers. The tsoutliers and tsclean functions in the forecast package provide some simple heuristic methods for identifying and correcting outliers. Structural models are implemented in StructTS in stats, and in the stsm package. Efficient Bayesian inference for nonlinear and non-Gaussian state space models is provided in bssm.
Stochastic volatility models are handled by stochvol in a Bayesian framework. Count time series models are handled in the tscount and acp packages. Censored time series can be modelled using cents and carx. ARCensReg fits univariate censored regression models with autoregressive errors. Portmanteau tests are provided via Box.test in the stats package.
Additional tests are given by portes and WeightedPortTest. Change point detection is provided in strucchange using linear regression models, and in trend using nonparametric tests. The changepoint package provides many popular changepoint methods, and ecp does nonparametric changepoint detection for univariate and multivariate series.
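As an example of the portmanteau tests mentioned above, a Ljung-Box test on ARIMA residuals needs only base R; the lh series and the AR(1) order are arbitrary choices for illustration:

```r
# Fit a simple AR(1) model and test its residuals for autocorrelation.
fit <- arima(lh, order = c(1, 0, 0))
Box.test(residuals(fit), lag = 10, type = "Ljung-Box")
```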
InspectChangepoint uses sparse projection to estimate changepoints in high-dimensional time series. Tests for possibly non-monotonic trends are provided by funtimes. Time series imputation is provided by the imputeTS package; some more limited facilities are available using na.interp from the forecast package. Forecasts can be combined using ForecastComb, which supports many forecast combination methods including simple, geometric and regression-based combinations.
Forecast evaluation is provided in the accuracy function from forecast. Distributional forecast evaluation using scoring rules is available in scoringRules. The Diebold-Mariano test for comparing the forecast accuracy of two models is implemented in the dm.test function from forecast. A multivariate version of the Diebold-Mariano test is provided by multDM.
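The error measures reported by accuracy() can be reproduced by hand in base R; the airline-model orders below are an assumed illustration, not a recommendation:

```r
# Hold out the last two years, forecast them, and score the forecasts.
train <- window(AirPassengers, end = c(1958, 12))
test  <- window(AirPassengers, start = c(1959, 1))

fit <- arima(log(train), order = c(0, 1, 1),
             seasonal = list(order = c(0, 1, 1), period = 12))
fc  <- exp(predict(fit, n.ahead = length(test))$pred)  # back-transformed

e    <- as.numeric(test - fc)
rmse <- sqrt(mean(e^2))   # root mean squared error
mae  <- mean(abs(e))      # mean absolute error
```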
Tidy tools for forecasting are provided by sweep, converting objects produced in forecast to "tidy" data frames.

Frequency analysis

Spectral density estimation is provided by spectrum in the stats package, including the periodogram, smoothed periodogram and AR estimates. Bayesian spectral inference is provided by bspec. The Lomb-Scargle periodogram for unevenly sampled time series is computed by lomb.
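For instance, the raw and smoothed periodograms of a classic series can be computed with spectrum() from stats; the smoothing spans are an arbitrary choice:

```r
# Periodogram and smoothed periodogram of the annual sunspot numbers.
s_raw    <- spectrum(sunspot.year, plot = FALSE)
s_smooth <- spectrum(sunspot.year, spans = c(5, 7), plot = FALSE)

# Dominant period of the smoothed estimate (the ~11-year solar cycle).
peak_period <- 1 / s_smooth$freq[which.max(s_smooth$spec)]
```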
The wavelets package includes functions for computing wavelet filters, wavelet transforms and multiresolution analyses. Wavelet methods for time series analysis based on Percival and Walden are given in wmtsa. WaveletComp provides some tools for wavelet-based analysis of univariate and bivariate time series, including cross-wavelets, phase-difference and significance tests.
Tests of white noise using wavelets are provided by hwwntest. Further wavelet methods can be found in the brainwaver, rwt, waveslim, wavethresh and mvcwt packages. Harmonic regression using Fourier terms is implemented in HarmonicRegression. The forecast package also provides some simple harmonic regression facilities via the fourier function.
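Harmonic regression itself needs nothing beyond lm(): seasonality is modelled with sine and cosine terms at the seasonal frequency and its harmonics (two harmonics here, an arbitrary choice for illustration):

```r
# Harmonic regression for a monthly series: trend plus two Fourier harmonics.
tt  <- as.numeric(time(AirPassengers))
fit <- lm(log(AirPassengers) ~ tt + sin(2 * pi * tt) + cos(2 * pi * tt)
                                  + sin(4 * pi * tt) + cos(4 * pi * tt))
```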
Decomposition and Filtering

Filters and smoothing: The robfilter package provides several robust time series filters. Seasonal decomposition is discussed below. Autoregressive-based decomposition is provided by ArDec. Tools for empirical mode decomposition (EMD), including ensemble EMD, are available in hht.
An alternative implementation of ensemble EMD and its complete variant are available in Rlibeemd. Enhanced STL decomposition is available in stlplus. Seasonal adjustment of daily time series, allowing for day-of-week, time-of-month, time-of-year and holiday effects is provided by dsa.
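The STL decomposition that stlplus enhances is available in base R via stl(); a log transformation is applied here since the example series has multiplicative seasonality:

```r
# Loess-based seasonal-trend decomposition into seasonal, trend, remainder.
dec <- stl(log(AirPassengers), s.window = "periodic")
head(dec$time.series)    # columns: seasonal, trend, remainder
```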
Packages are also available for the seasonal analysis of health data (including regression models, time-stratified case-crossover, plotting functions and residual checks), for seasonal analysis and graphics, especially in climatology, and for optimal deseasonalization of geophysical time series using AR fitting.

Stationarity, Unit Roots, and Cointegration

Stationarity and unit roots: MultipleBubbles tests for the existence of bubbles based on the Phillips-Shi-Yu method. Time series costationarity determination is provided by costat.
LSTS has functions for locally stationary time series analysis. Locally stationary wavelet models for nonstationary time series are implemented in wavethresh including estimation, plotting, and simulation functionality for time-varying spectra.
The Engle-Granger two-step method with the Phillips-Ouliaris cointegration test is implemented in tseries and urca. The latter additionally contains functionality for the Johansen trace and lambda-max tests. CommonTrend provides tools to extract and plot common trends from a cointegration system. Parameter estimation and inference in a cointegrating regression are implemented in cointReg.

Nonlinear Time Series Analysis

Nonlinear autoregression: Tools for nonlinear time series analysis are provided in NTS, including threshold autoregressive models, Markov-switching models, convolutional functional autoregressive models, and nonlinearity tests.
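The Engle-Granger two-step idea mentioned above can be sketched in base R: regress one series on the other, then test the residuals for a unit root. The Phillips-Perron test from stats is used here as a stand-in; tseries and urca supply the proper cointegration critical values.

```r
# Two simulated series that are cointegrated by construction.
set.seed(42)
x <- cumsum(rnorm(200))      # a random walk
y <- 2 * x + rnorm(200)      # shares the same stochastic trend

res <- residuals(lm(y ~ x))  # step 1: cointegrating regression
PP.test(ts(res))             # step 2: unit-root test on the residuals
```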
Neural network autoregression is also provided in GMDH. NlinTS includes neural network VAR, and a nonlinear version of the Granger causality test based on feedforward neural networks. Markov-switching autoregressive models are provided in MSwM, while dependent mixtures of latent Markov models are given in depmix and depmixS4 for categorical and continuous time series. Various tests for nonlinearity are provided in fNonlinear.
Additional functions for nonlinear time series are available in nlts and nonlinearTseries. Fractal time series modeling and analysis is provided by fractal.
CRAN Task View: Time Series Analysis
Entropy

Shannon entropy based on the spectral density is computed using ForeCA. RTransferEntropy measures information flow between time series with Shannon and Renyi transfer entropy. An entropy measure based on the Bhattacharya-Hellinger-Matusita distance is implemented in tseriesEntropy.
Various approximate and sample entropies are computed using TSEntropies.

Dynamic Regression Models

Dynamic linear models: A convenient interface for fitting dynamic regression models via OLS is available in dynlm; an enhanced approach that also works with other regression functions and more time series classes is implemented in dyn.
More advanced dynamic system equations can be fitted using dse. Functions for distributed lag nonlinear modelling are provided in dlnm. Time-varying parameter models can be fitted using the tpr package. Dynamic modeling of various kinds is available in dynr including discrete and continuous time, linear and nonlinear models, and different types of latent variables.
Vector autoregressions: these models are restricted to be stationary. Automated VAR models and networks are available in autovarCore. Another implementation with bootstrapped prediction intervals is given in VAR.
EvalEst facilitates Monte Carlo experiments to evaluate the associated estimation methods. Vector error correction models are available via the urca, ecm, vars and tsDyn packages, including versions with structural constraints and thresholding.

Time series component analysis: Time series factor analysis is provided in tsfa.
ForeCA implements forecastable component analysis by searching for the best linear transformations that make a multivariate time series as forecastable as possible. PCA4TS finds a linear transformation of a multivariate time series giving lower-dimensional subseries that are uncorrelated with each other.
One-sided dynamic principal components are computed in odpc. Frequency-domain-based dynamic PCA is implemented in freqdom.
Multivariate state space models

An implementation is provided by the KFAS package, which offers a fast multivariate Kalman filter, smoother, simulation smoother and forecasting.
FKF provides a fast and flexible implementation of the Kalman filter, which can deal with missing values. Yet another implementation is given in the dlm package which also contains tools for converting other multivariate models into state space form. All of these packages assume the observational and state error terms are uncorrelated. Partially-observed Markov processes are a generalization of the usual linear multivariate state space models, allowing non-Gaussian and nonlinear models.
These are implemented in the pomp package.
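Base R already covers the univariate case: StructTS() fits a structural model by maximum likelihood, tsSmooth() runs the Kalman smoother, and predict() forecasts. KFAS, dlm and pomp generalize this to the multivariate, non-Gaussian and nonlinear settings described above. The Nile series and the local-level specification are an illustrative choice:

```r
# Local-level structural time series model with Kalman smoothing.
fit    <- StructTS(Nile, type = "level")
smooth <- tsSmooth(fit)              # smoothed state (level) estimates
pred   <- predict(fit, n.ahead = 5)  # five-step-ahead forecasts with SEs
```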