• Yicun Ouyang

Student thesis: PhD


Time series modelling and forecasting play a critical role in many practical fields and have therefore attracted considerable attention and research. To improve the accuracy and efficiency of modelling and forecasting, various econometric models, neural networks and other methods have been introduced and developed over the past decades. Among existing neural network architectures and learning algorithms, the self-organising map (SOM) is one of the most popular models. A recently developed model, the self-organising mixture autoregressive (SOMAR) network, combines the SOM with the mixture autoregressive (MAR) model and employs mixture autoregressive local models to describe non-stationary time series. Because the fixed neuron topology of SOMAR limits its prediction accuracy and practical applicability, this thesis introduces the neural gas mixture autoregressive (NGMAR) model as a solution to this problem. It organises the neurons in a more flexible network, the neural gas. While retaining the features of SOMAR, the proposed network defines the neighbourhood of neurons as a 'gas' based on their sum of autocorrelation (SAC) rankings and updates the reference vectors accordingly, further improving on the performance of SOMAR.

As causal systems, most neural networks, including SOMAR and NGMAR, do not take into account the influence of future expectations on the current value during training and prediction. This problem can be addressed by applying the non-causal concept to these networks. By allowing dependence on future expectations, the non-causal NGMAR model uses both past values and future expectations for training and prediction. Moreover, weightings varying with the time lags of these values are assigned so that more recent values exert greater influence. These properties enable the non-causal NGMAR to produce more accurate predictions.
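The neural-gas update driven by SAC rankings can be sketched as follows. This is a minimal illustration, not the thesis's exact algorithm: the function names (`sac`, `ngmar_update`), the particular SAC formula (sum of absolute residual autocorrelations over all lags), and the learning-rate parameters `eps` and `lam` are all assumptions made for the example.

```python
import numpy as np

def sac(x, r):
    """Sum of absolute autocorrelations of the residual between segment x
    and reference vector r (illustrative definition: residuals of a good
    local model should be close to white noise, giving a low SAC)."""
    e = x - r
    e = e - e.mean()
    denom = np.dot(e, e) + 1e-12  # guard against zero variance
    return sum(abs(np.dot(e[:-k], e[k:]) / denom) for k in range(1, len(e)))

def ngmar_update(refs, x, eps=0.5, lam=2.0):
    """One neural-gas style update step: rank all reference vectors by the
    SAC of their residuals, then move each towards the input segment x with
    a step size that decays exponentially with its rank."""
    ranks = np.argsort([sac(x, r) for r in refs])  # best (lowest SAC) first
    for rank, idx in enumerate(ranks):
        refs[idx] += eps * np.exp(-rank / lam) * (x - refs[idx])
    return refs
```

Because the neighbourhood is defined by ranking rather than by a fixed grid, every reference vector is adapted on every step, with the best-matching ones moving the most.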
This thesis also generalises the one-step-ahead prediction of the self-organising network to multistep forecasting. The extended self-organising neural network builds input segments of various lengths and updates only the corresponding parts of the reference vectors, which enables the network to learn multiple regressive models for forecasting over various horizons. The interdependencies among future points are naturally preserved across all forecasting tasks. All the proposed models are applied to both benchmark and financial datasets. To evaluate their forecasting accuracy, the normalised root mean squared error (NRMSE), the correct prediction percentage (CPP) and the corresponding t-test results are calculated. The proposed models are shown to yield significantly better prediction performance than popular econometric time series models and other neural networks.
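The two evaluation metrics can be sketched as below. These are common conventions rather than the thesis's exact definitions: normalising the RMSE by the standard deviation of the actual series is one of several usual choices, and CPP is taken here as the percentage of steps whose predicted direction of change matches the actual direction, as is typical in financial forecasting.

```python
import numpy as np

def nrmse(actual, predicted):
    """Normalised root mean squared error: RMSE divided by the standard
    deviation of the actual series (one common normalisation)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / np.std(actual)

def cpp(actual, predicted):
    """Correct prediction percentage: share of steps where the predicted
    direction of change (up/down) agrees with the actual direction."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.sign(np.diff(actual)) == np.sign(np.diff(predicted)))
```

A lower NRMSE indicates a better amplitude fit, while a higher CPP indicates better directional accuracy; the two can disagree, which is why the thesis reports both together with t-tests.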
Date of Award: 1 Aug 2016
Original language: English
Awarding Institution
  • The University of Manchester
Supervisor: Hujun Yin