Browsing by Author "Guo, Renkuan"
- Item (Open Access): Accurate portfolio risk-return structure modelling (2006). Hossain, Nafees; Troskie, Casper G; Guo, Renkuan. Markowitz's modern portfolio theory has played a vital role in investment portfolio management and has constantly pushed the development of volatility models, particularly stochastic volatility models, which reveal the dynamics of conditional volatility. Financial time series and volatility models have become one of the most active areas in operations research. In this thesis, one of the areas we explore is the theoretical formulation of the optimal portfolio selection problem in the Ito calculus framework; in particular, a problem in the stochastic calculus of variations: seeking the optimal stochastic volatility diffusion family that facilitates the best portfolio selection under continuous-time stochastic optimal control. One of the properties this study examines is the left-shifting role of the GARCH(1, 1) (Generalised Autoregressive Conditional Heteroskedastic) model's efficient frontier. This study considers many instances where the left-shifting superior behaviour of the GARCH(1, 1) frontier is observed. One such instance is when GARCH(1, 1) is compared with the volatility modelling extensions of the GARCH family in a single-index framework. This study demonstrates the persistence of the superiority of the GARCH(1, 1) frontier within both multiple- and single-index contexts of modern portfolio theory. Many portfolio optimization models are investigated, particularly the Markowitz model and the Sharpe multiple- and single-index models. Includes bibliographical references (p. 313-323).
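For illustration, a minimal Markowitz mean-variance frontier sketch in Python, using an assumed expected-return vector and covariance matrix (hypothetical values, not the GARCH-based estimates studied in the thesis):

```python
import numpy as np

# Minimal Markowitz frontier sketch; inputs are hypothetical, not the thesis's estimates.
mu = np.array([0.08, 0.12, 0.10])          # assumed expected returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])      # assumed covariance matrix

inv = np.linalg.inv(Sigma)
ones = np.ones(len(mu))
a, b, c = ones @ inv @ ones, ones @ inv @ mu, mu @ inv @ mu
d = a * c - b * b

def frontier_variance(target_return):
    """Variance of the minimum-variance portfolio achieving a target mean return."""
    return (a * target_return**2 - 2 * b * target_return + c) / d

for r in np.linspace(mu.min(), mu.max(), 5):
    print(f"target return {r:.3f}: frontier std {np.sqrt(frontier_variance(r)):.4f}")
```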
- Item (Open Access): An alternative model for multivariate stable distributions (2009). Jama, Siphamandla; Guo, Renkuan. As the title "An Alternative Model for Multivariate Stable Distributions" suggests, this thesis draws on the methodology of [J36] and derives an alternative to the sub-Gaussian alpha-stable distribution as another model for multivariate stable data, without using the spectral measure as a dependence structure. From our investigation, firstly, we echo that the assumption of "Gaussianity" must be rejected as a model for, particularly, high-frequency financial data, based on evidence from the Johannesburg Stock Exchange (JSE). Secondly, the introduced technique models bivariate return data far better than the Gaussian model. We argue that, unlike the sub-Gaussian stable model and the model involving a spectral measure, this technique is not subject to estimation of a joint index of stability; as such it may remain a superior alternative in empirical stable distribution theory. Thirdly, we confirm that the Gaussian Value-at-Risk and Conditional Value-at-Risk measures are more optimistic and misleading, while their stable counterparts are more informative and reasonable. Fourthly, our results confirm that stable distributions are more appropriate for portfolio optimization than the Gaussian framework.
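For illustration, a minimal sketch comparing Gaussian (parametric) VaR/CVaR with empirical estimates on simulated heavy-tailed (Student-t) returns; the data and tail level are assumptions, not the thesis's JSE data or stable-law fits:

```python
import numpy as np
from scipy import stats

# Illustrative only: Gaussian vs. empirical VaR/CVaR on simulated heavy-tailed returns.
returns = stats.t.rvs(df=3, scale=0.01, size=100_000, random_state=42)

alpha = 0.01                                   # 1% tail
mu, sigma = returns.mean(), returns.std(ddof=1)

gauss_var = -(mu + sigma * stats.norm.ppf(alpha))
gauss_cvar = -(mu - sigma * stats.norm.pdf(stats.norm.ppf(alpha)) / alpha)

emp_var = -np.quantile(returns, alpha)
emp_cvar = -returns[returns <= np.quantile(returns, alpha)].mean()

print(f"Gaussian  VaR {gauss_var:.4f}, CVaR {gauss_cvar:.4f}")
print(f"Empirical VaR {emp_var:.4f}, CVaR {emp_cvar:.4f}")
# On heavy-tailed data the Gaussian figures typically understate the empirical tail risk.
```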
- Item (Open Access): Contributions to statistical machine learning algorithm (2011). Cui, Yan Hong; Guo, Renkuan. This thesis focuses on computational statistics, in particular the DEAR (differential equation associated regression) model direction; with that in mind, the journal papers are written as contributions to the statistical machine learning algorithm literature.
- Item (Open Access): Cumulative sum quality control charts design and applications (2006). Kesupile, Galeboe; Guo, Renkuan. Classical statistical process control charts are essential in statistical quality control exercises and have therefore constantly attracted attention for quality improvement. However, the establishment of control charts requires large-sample data (say, no fewer than 1 000 data points). On the other hand, we notice that the small-sample-based grey system theory approach is well established and applied in many areas: social, economic, industrial, military and scientific research fields. In this research, the short-term trend curve given by the GM(1, 1) model is merged into the Shewhart and two-sided CUSUM control charts to establish a Grey Predictive Shewhart control chart and a Grey Predictive CUSUM control chart. The GM(2, 1) model is also briefly checked for how accurate it could be compared to the GM(1, 1) model in control charts. Industrial process data collected from the TBF Packaging Machine Company in Taiwan were analysed in terms of these new developments as an illustrative example of grey quality control charts.
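For illustration, a minimal GM(1, 1) fitting and forecasting sketch on a short hypothetical series; the grey-predictive chart construction itself is not reproduced here:

```python
import numpy as np

def gm11_forecast(x0, n_ahead=3):
    """Minimal GM(1, 1) grey model: fit a short positive-valued series and extrapolate."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # least squares for x0(k) + a*z1(k) = b

    def x1_hat(k):                                      # accumulated prediction at step k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    k = np.arange(len(x0) + n_ahead)
    acc = x1_hat(k)
    out = np.empty(len(k))
    out[0] = x0[0]
    out[1:] = np.diff(acc)                              # inverse AGO back to the original scale
    return out

# Hypothetical short process series: fitted values followed by 3-step-ahead forecasts
print(np.round(gm11_forecast([2.87, 3.28, 3.34, 3.62, 3.85]), 3))
```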
- Item (Open Access): Empirical evidences of coherent market hypothesis (2002). Kao, Peter Ta-Chao; Guo, Renkuan. In this dissertation, empirical explorations of basic properties of the CMH-based returns distribution are conducted on the Johannesburg Stock Exchange. This is followed by a theoretical exploration of the stochastic differential equations that govern the underlying market dynamics.
- Item (Open Access): Empirical evidences of stock split market effects (2011). Mhuru, Trust Taruona; Guo, Renkuan. Under normal financial market circumstances (i.e., not under the shadow of a financial crisis) it is common to believe that buying shares from large institutions leads to high profit. This is because the shares are of high trading value due to the solid financial foundation and superior performance of large institutions or companies. In contrast to these traders' belief, large companies often exercise a "stock split" to strengthen confidence in the company and encourage more investment in the company. A "stock split" increases the number of shares outstanding without increasing the company's capital. A conjecture is that a "stock split" will increase market liquidity because of the price decrease of each share; consequently, market trading activity would intensify such that log-returns would be higher and volatility higher accordingly. The financial market literature shows that the impacts of "stock splits" have been controversial. In other words, the influence of a "stock split" on the market did not always behave as management expected. In this thesis, we intend to use the limited available stock split data from NASDAQ to explore some empirical evidence on the impacts of "stock splits". We also propose a DEAR-based trend analysis of log-returns and market volatility, measured by daily trading range, for technical analysis of "stock split" impacts.
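For illustration, a minimal sketch of the two quantities the trend analysis tracks, daily log-returns and a high-low daily-range volatility proxy, on hypothetical prices around a 2-for-1 split (not the thesis's NASDAQ data):

```python
import numpy as np
import pandas as pd

# Hypothetical prices with a 2-for-1 split after day 3; in practice prices would be
# split-adjusted before computing returns to avoid a spurious -log(2) return.
df = pd.DataFrame({
    "close": [100.0, 102.0, 101.0, 50.8, 51.5, 52.1],
    "high":  [101.0, 103.0, 102.5, 51.9, 52.4, 53.0],
    "low":   [ 99.2, 100.8, 100.1, 50.1, 50.9, 51.4],
})

df["log_return"] = np.log(df["close"]).diff()
df["daily_range"] = np.log(df["high"] / df["low"])   # daily trading range as a volatility proxy

print(df[["log_return", "daily_range"]].round(4))
```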
- Item (Open Access): Empirical modelling of high-frequency foreign exchange rates (2004). Packirisamy, Someshini; Guo, Renkuan. There is a wealth of information available on modelling foreign exchange time series data; however, research on modelling and predicting high-frequency foreign exchange data is less prominent. Furthermore, there does not appear to be much published work on the modelling and prediction of high-frequency South African Rand/United States Dollar (ZAR/USD) exchange rates. A fair amount of noise is embedded in high-frequency time series data, especially the ZAR/USD exchange rates, and modelling these series requires specialized models. In addition, lengthy high-frequency foreign exchange data are largely unavailable for the South African market. This dissertation undertakes empirical explorations to model high-frequency foreign exchange time series (primarily the ZAR/USD series) through the use of multi-agent neural networks, linear Kalman filters and fuzzy Markov chain theory.
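For illustration, a minimal linear Kalman filter (local-level model) sketch for smoothing a noisy exchange-rate-like series; the noise variances and simulated data are assumptions, not the dissertation's fitted models:

```python
import numpy as np

def kalman_local_level(y, process_var=1e-5, obs_var=1e-3):
    """Minimal 1-D Kalman filter: random-walk state observed with additive noise."""
    x_hat, p = y[0], 1.0                    # initial state estimate and variance
    filtered = []
    for obs in y:
        p = p + process_var                 # predict: state follows a random walk
        k = p / (p + obs_var)               # Kalman gain
        x_hat = x_hat + k * (obs - x_hat)   # update with the new observation
        p = (1.0 - k) * p
        filtered.append(x_hat)
    return np.array(filtered)

# Hypothetical noisy exchange-rate-style series
rng = np.random.default_rng(1)
true_level = np.cumsum(rng.normal(0, 0.003, 200)) + 7.5
noisy = true_level + rng.normal(0, 0.03, 200)
print(kalman_local_level(noisy)[:5].round(4))
```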
- Item (Open Access): Empirical statistical modelling for crop yields predictions: Bayesian and uncertainty approaches (2015). Adeyemi, Rasheed Alani; Guo, Renkuan; Dunne, Tim. This thesis explores uncertainty statistics to model agricultural crop yields in a situation where there are neither sampling observations nor historical records. The Bayesian approach to a linear regression model is useful for prediction of crop yield when there are data quantity issues and model structure uncertainty, and the regression model involves a large number of explanatory variables. Data quantity issues might occur when a farmer is cultivating a new crop variety, moving to a new farming location or introducing a new farming technology, where the situation may warrant a change in current farming practice. The first part of this thesis involved the collection of data from domain experts and the elicitation of probability distributions. Uncertainty statistics, the foundation of uncertainty theory and the data gathering procedures are discussed in detail. We proposed an estimation procedure for the estimation of uncertainty distributions. The procedure was then implemented on agricultural data to fit uncertainty distributions to five cereal crop yields. A Delphi method was introduced and used to fit uncertainty distributions for multiple experts' data on sesame seed yield. The thesis defined an uncertainty distance and derived a distance for the difference between two uncertainty distributions. We lastly estimated the distance between a hypothesized distribution and an uncertainty normal distribution. Although the applicability of uncertainty statistics is limited to one-sample models, the approach provides a fast way to establish a standard for process parameters. Where no sampling observation exists or it is very expensive to acquire, the approach provides an opportunity to engage experts and come up with a model for guiding decision making. In the second part, we fitted a full dataset obtained from an agricultural survey of small-scale farmers to a linear regression model using direct Markov Chain Monte Carlo (MCMC), Bayesian estimation (with a uniform prior) and the maximum likelihood estimation (MLE) method. The results obtained from the three procedures yielded similar mean estimates, but the credible intervals in the Bayesian estimates were found to be narrower than the confidence intervals in the MLE method. The predictive performance of the estimated model was then assessed using simulated data for a set of covariates. Furthermore, the dataset was randomly split into two data sets. An informative prior was estimated from one half, called the "old data", using the Ordinary Least Squares (OLS) method. Three models were then fitted to the second half, called the "new data": a general linear model (GLM) (M1), a Bayesian model with a non-informative prior (M2) and a Bayesian model with an informative prior (M3). A leave-one-out cross-validation (LOOCV) method was used to compare the predictive performance of these models. It was found that the Bayesian models showed better predictive performance than M1. M3 (with the expert prior) had moderate average cross-validation (CV) error and CV standard error. The GLM performed worst, with the least average CV error and the highest CV standard error among the models. In model M3 (expert prior), the predictor variables were found to be significant at 95% credible intervals. In contrast, most variables were not significant under models M1 and M2.
Also, the model with the informative prior had narrower credible intervals compared to the non-informative-prior and GLM models. The results indicated that variability and uncertainty in the data were reasonably reduced due to the incorporation of the expert (informative) prior. We lastly investigated the residual plots of these models to assess their prediction performance. Bayesian model averaging (BMA) was later introduced to address the issue of model structure uncertainty of a single model. BMA allows the computation of a weighted average over possible model combinations of predictors. An approximate AIC weight was then proposed for model selection instead of frequentist hypothesis testing (or model comparison within a set of competing candidate models). The method is flexible and easier to interpret than raw AIC or the Bayesian information criterion (BIC), which approximates the Bayes factor. Zellner's g-prior was considered appropriate as it has been widely used in linear models; it preserves the correlation structure among predictors in its prior covariance. The method also yields closed-form marginal likelihoods, which leads to huge computational savings by avoiding sampling in the parameter space as in BMA. We lastly determined a single optimal model from all possible combinations of models and also computed the log-likelihood of each model.
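For illustration, a minimal conjugate Bayesian linear regression sketch with a Gaussian prior and known noise variance, reporting posterior means and 95% credible intervals; the data and prior settings are hypothetical, not the survey data or expert priors used in the thesis:

```python
import numpy as np

# Conjugate Bayesian linear regression sketch (known noise variance); all values hypothetical.
rng = np.random.default_rng(2)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([2.0, 1.5, -0.7])
sigma2 = 0.5 ** 2
y = X @ beta_true + rng.normal(0, np.sqrt(sigma2), n)

prior_mean = np.zeros(p)             # an informative prior would come from experts or OLS on "old data"
prior_cov = np.eye(p) * 10.0         # weakly informative here

post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + X.T @ X / sigma2)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean + X.T @ y / sigma2)

lower = post_mean - 1.96 * np.sqrt(np.diag(post_cov))
upper = post_mean + 1.96 * np.sqrt(np.diag(post_cov))
for j in range(p):
    print(f"beta[{j}]: posterior mean {post_mean[j]: .3f}, "
          f"95% credible interval ({lower[j]: .3f}, {upper[j]: .3f})")
```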
- Item (Open Access): Forecasting stock price movements using neural networks (2006). Rank, Christian; Guo, Renkuan. The prediction of security prices has proven to be one of the most important but most difficult tasks in financial operations. Linear approaches have failed to model the non-linear behaviour of markets, and non-linear approaches have turned out to possess too many constraints. Neural networks seem to be a suitable method to overcome these problems, since they provide algorithms which process large sets of data from a non-linear context and yield thorough results. The first problem addressed by this research paper is the applicability of neural networks to markets as a tool for pattern recognition. It is shown that markets possess the necessary requirements for the use of neural networks, i.e. markets show patterns which are exploitable.
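For illustration, a toy sketch of a small feed-forward network used for pattern recognition on lagged returns; the simulated (noise) data and architecture are assumptions, so the out-of-sample hit rate should sit near chance, whereas genuinely exploitable patterns in real data would raise it:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy sketch: classify next-day return direction from the 5 preceding returns.
rng = np.random.default_rng(3)
returns = rng.normal(0, 0.01, 1000)        # pure noise stand-in for market returns

lags = 5
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = (returns[lags:] > 0).astype(int)       # 1 = next-day return positive

split = 800
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X[:split], y[:split])
print("out-of-sample hit rate:", round(clf.score(X[split:], y[split:]), 3))
```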
- Item (Open Access): Fuzzy modelling of the Johannesburg Security Exchange overall index (2002). Musongole, Chibelushi Maxwell C; Guo, Renkuan. This thesis focuses on the empirical analysis of the fuzzy feature of the Johannesburg Stock Exchange (JSE) Overall Index using fuzzy logic techniques. Data for the period 1985-2001 are used in the analysis. Description of the fuzzy feature is crucial to a proper understanding of the movement of the JSE Overall Index and the South African economy; if understood, this fuzzy feature would inform financial and economic decisions. A preliminary fractal analysis is carried out before the fuzzy analysis to investigate the nature of the index. The index exhibits the Hurst phenomenon of long memory for periods of 100 days (approximately three months). Outside the long-memory periods, the index is found to exhibit antipersistent, or short-range dependency, characteristics. The fuzzy feature of the index is described by many aspects of fuzzy logic. The analysis of the fuzzy feature is carried out over sub-periods of approximately four years each. The index in each time period is partitioned into three fuzzy states: "low", "middle" and "high". The fuzzy states are important in assessing the fuzzy nature of the index. The partitioning reveals that the fuzzy states do not possess sharp boundaries, and their sizes are found to change with time. This reflects changes in the forces behind the dynamics of the index.
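For illustration, a minimal rescaled-range (R/S) estimate of the Hurst exponent on a simulated return series; this is a generic sketch, not the thesis's fractal analysis of the JSE Overall Index:

```python
import numpy as np

def hurst_rs(x, window_sizes=(10, 20, 40, 80, 160)):
    """Rescaled-range (R/S) estimate of the Hurst exponent of a 1-D series."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())   # mean-adjusted cumulative deviations
            r = dev.max() - dev.min()
            s = block.std(ddof=1)
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope   # H ~ 0.5 random walk, > 0.5 persistent (long memory), < 0.5 antipersistent

rng = np.random.default_rng(4)
returns = rng.normal(0, 0.01, 2000)                 # hypothetical daily returns
print("estimated Hurst exponent:", round(hurst_rs(returns), 3))
```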
- Item (Open Access): Grey differential equation modeling on stock prices (2005). Xue, Qifeng; Guo, Renkuan. Includes bibliographical references (leaves 110-111).
- Item (Open Access): Interval AR(1) modelling of South African stock market prices (2005). Biyana, Mahlubandile Dugmore; Guo, Renkuan. Includes bibliographical references (leaves 124-126).
- Item (Open Access): Interval-valued Uncertainty Capability Indices with South African Industrial Applications (2014). Gyekye, Kwame Boakye; Guo, Renkuan. Since the advent of statistical quality control and process capability analysis, their study and application have gained tremendous attention in both academia and industry. This attention is due to their ability to describe the capability of a complex process adequately and simply (i.e., using a unitless index) and, in some instances, to compare different manufacturing processes. However, the application of statistical quality control has come under intense criticism, notably in one car manufacturing company where the actual number of non-conforming units considerably exceeded expectation although probabilistic control measures were in place. This failure led to a large recall of vehicles and left a dent in the image of the company. One reason for this unfortunate instance is that classical quality control measures ignore human judgement; since there is considerable expert intuition in process engineering decision making, this element cannot be ignored. Hence this research applies the uncertainty theory proposed by Baoding Liu (2007) to incorporate human judgement into process capability analysis. The major finding of the thesis is that process capability indices under an uncertainty environment are interval-valued, together with their relevant characteristics. The study further developed the "sampling" uncertainty distributions and thus the "sampling" impacts on the newly defined uncertain process capability indices under Liu's uncertain normal distribution assumptions. In order to reach the main purpose of the thesis, a thoroughgoing literature review of probabilistic process capability indices is necessary.
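For reference, a minimal sketch of the classical (probabilistic) Cp and Cpk capability indices that the thesis generalises to interval-valued uncertain indices; the specification limits and measurements are hypothetical:

```python
import numpy as np

def cp_cpk(x, lsl, usl):
    """Classical process capability indices from the sample mean and standard deviation."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                    # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)       # capability accounting for centring
    return cp, cpk

# Hypothetical measurements and specification limits
rng = np.random.default_rng(5)
measurements = rng.normal(loc=10.02, scale=0.05, size=100)
print([round(v, 3) for v in cp_cpk(measurements, lsl=9.85, usl=10.15)])
```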
- Item (Open Access): Modelling of volatility of stock prices using GARCH models & its importance in portfolio construction (2009). Mtemeri, Tinotenda; Guo, Renkuan. This thesis investigates the possibility of modelling the risk of stocks in financial markets and evaluates the adequacy and effectiveness of univariate GARCH models, such as the symmetric GARCH and a few variations such as the EGARCH, TARCH and PARCH, in modelling volatility in the monthly returns of stocks traded on the Johannesburg Stock Exchange. This is further used to investigate the importance of GARCH modelling in portfolio construction using Improved Sharpe Single Index Models. The data used for model estimation were randomly selected from different sectors of the South African economy. GARCH models are estimated and validated for the data series of the 15 randomly selected JSE stocks. Conclusions are drawn regarding the different GARCH models, the best lag structure and the best error distributions for modelling. The GARCH(1,1) model demonstrates a relatively good forecasting performance as far as the short-term forecasting horizon is concerned. However, the use of alternatives to the more common GARCH(1,1) and the use of non-normal distributions are not clearly supported. Likewise, the use of higher-order GARCH models such as the GARCH(1,2), GARCH(2,1) and GARCH(2,2) is not clearly supported, and the GARCH(1,1) remains superior overall to these models. The results obtained in this thesis are of paramount importance in portfolio construction, option pricing and formulating hedging strategies. An illustration of the importance of the GARCH(1,1) model in portfolio construction is given, and conclusions are drawn regarding its usefulness in improving volatility estimates for purposes of portfolio construction.
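For illustration, a minimal GARCH(1,1) maximum-likelihood sketch on a simulated return path; the parameter values and data are assumptions, not the thesis's fitted JSE models:

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, r):
    """Gaussian negative log-likelihood of a GARCH(1,1) variance recursion."""
    omega, alpha, beta = params
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)

# Simulate a GARCH(1,1) path so the estimates have something to recover
rng = np.random.default_rng(6)
omega_t, alpha_t, beta_t = 0.02, 0.08, 0.90
n = 3000
r = np.empty(n)
s2 = omega_t / (1 - alpha_t - beta_t)               # unconditional variance
for t in range(n):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega_t + alpha_t * r[t] ** 2 + beta_t * s2

res = minimize(garch11_neg_loglik, x0=[0.05, 0.1, 0.8], args=(r,),
               bounds=[(1e-6, None), (1e-6, 1), (1e-6, 1)])
print("omega, alpha, beta estimates:", np.round(res.x, 3))
```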
- Item (Open Access): Optimal liquidation strategies (2006). Ennis, Michael; Maritz, EJ; Guo, Renkuan. Liquidation strategies address the problem of minimising the transaction costs incurred in a portfolio liquidation. Transaction costs are the difference between the current market value and the realised value after the liquidation. A strategy for performing a liquidation is especially important to institutional investors due to the large size of their trades: large trades can have a significant effect on the price of a security, which can impact the realised returns of the liquidation. The models investigated solve for trading trajectories that maximise the realised value of the liquidation in a mean-variance framework, where the expected return of the strategy is constrained by its variance and the investor's risk preference. Parameters used in the liquidity functions are estimated for securities on the South African JSE Securities Exchange. The effects of security liquidity, volatility, stock correlation and the length of the liquidation horizon on the optimal strategy are investigated. There is little or no existing literature that attempts to model these functions in the South African market. Due to the smaller size of the South African market, as well as the number of thinly traded shares compared to most markets studied in the literature, many securities are highly illiquid. We investigate relationships between firm size, daily traded value and these liquidity parameters. General rules are presented to help traders improve a liquidation strategy without the need to estimate all the parameters required to calculate an optimal strategy using one of these models.
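For illustration, a sketch of one standard mean-variance liquidation trajectory (in the spirit of Almgren and Chriss, which may differ from the specific models estimated in the thesis); all parameter values are hypothetical:

```python
import numpy as np

# Mean-variance optimal liquidation trajectory sketch; all parameters are hypothetical.
X = 1_000_000          # shares to liquidate
T = 5.0                # liquidation horizon in days
sigma = 0.95           # daily price volatility (currency units per share)
eta = 2.5e-6           # temporary impact coefficient
lam = 1e-6             # risk aversion

kappa = np.sqrt(lam * sigma ** 2 / eta)
t = np.linspace(0.0, T, 6)
holdings = X * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

for ti, xi in zip(t, holdings):
    print(f"t = {ti:.1f} days: hold {xi:,.0f} shares")
# Higher risk aversion (lam) front-loads the schedule; lam -> 0 recovers the
# linear, minimum-impact trajectory.
```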
- Item (Open Access): Option Pricing models with Stochastic Volatility and Jumps (2009). Kalsheker, Farhan; Guo, Renkuan. Exotic equity options are specialized instruments which are typically traded over the counter. Their prices are primarily determined by option pricing models which should be able to price exotic options consistently with the market prices of corresponding vanilla options. Additionally, option pricing models should have intuitive dynamics which are able to capture real world behavior (such as stochastic volatility effects and jumps in the price of the underlying). This dissertation tackles the question of which option pricing model to use; it compares diffusion, pure jump and jump-diffusion models. All models are fitted to one-day price data on S&P500 European vanilla options; the models with the best fit exhibit the smallest error in pricing between model prices and market prices. The stochastic volatility with jumps (SVJ) models are found to perform the best. The SVJ-DE model, a new variant of this type of model (which is based on Heston-type stochastic volatility and Kou-type double exponential jumps in the log price), is presented and tested. The Heston SV model is ranked third best. There is a significant performance gap between the SV/SVJ models and the remaining models. The variance-gamma model with stochastic time is found to be the best performing model from the pure jump and simple jump-diffusion categories. The Kou jump-diffusion model with double exponential jumps and constant diffusion volatility ranks next, followed by the Merton jump-diffusion model and the variance-gamma pure jump model. On comparison of model and market implied volatility surfaces, the pure jump and simple jump-diffusion models are found to be efficient at generating volatility smile effects, but not volatility skew effects. The converse holds for the Heston SV model. The SVJ models exploit this behavior in an attempt to use the jump component to generate the smile effects on the short end of the volatility surface and the stochastic volatility diffusion component to generate the skew effects on the long end of the volatility surface. The application of the SV and SVJ models is demonstrated by computing the prices of barrier options via Monte Carlo simulation. Both of the SVJ models give similar barrier option prices. Diffusion processes and jump processes are the two main building blocks of any option pricing model. This research finds that simple jump-diffusion models and pure jump models are unable to demonstrate good performance when fitting to a complete grid of market option prices. The Heston stochastic volatility pure diffusion model gives better performance compared to these jump models. The SVJ models which have both a stochastic volatility diffusion component and a jump component are found to give the best performance. The SVJ-DE model has the added advantage of being able to generate upward and downward jumps from different exponential distributions, versus the Bates model which generates jumps from a normal distribution.
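For illustration, a minimal Monte Carlo sketch pricing an up-and-out barrier call under Heston stochastic volatility with a full-truncation Euler scheme; the parameters are hypothetical, not calibrated to the S&P500 data used in the dissertation:

```python
import numpy as np

# Up-and-out call under Heston dynamics via Monte Carlo; all parameters are hypothetical.
rng = np.random.default_rng(7)

S0, K, B, T, r = 100.0, 100.0, 130.0, 1.0, 0.03
v0, kappa, theta, xi, rho = 0.04, 2.0, 0.04, 0.5, -0.7
n_steps, n_paths = 252, 50_000
dt = T / n_steps

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
alive = np.ones(n_paths, dtype=bool)            # paths that have not hit the barrier

for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
    v_pos = np.maximum(v, 0.0)                  # full truncation of the variance
    S *= np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
    v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    alive &= S < B                              # knock out paths that breach the barrier

payoff = np.where(alive, np.maximum(S - K, 0.0), 0.0)
price = np.exp(-r * T) * payoff.mean()
print(f"up-and-out call price (Heston MC): {price:.4f}")
```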
- Item (Open Access): Option pricing using hidden Markov models (2006). Anderson, Michael; Guo, Renkuan. This work presents an option pricing model that accommodates parameters that vary over time, whilst still retaining a closed-form expression for option prices: the Hidden Markov Option Pricing Model. This is possible due to the macro-structure of the model and provides the added advantage of ensuring efficient computation of option prices. The model turns out to be a very natural extension of the Black-Scholes model, allowing for time-varying input parameters.
- Item (Open Access): Pricing options in a fuzzy environment (2008). Ramsden, Bevan; Guo, Renkuan. Although fuzzy logic is not new, it is only since 2004 that an axiomatic theory has been created that has all the desirable properties of fuzzy logic. This theory, named credibility theory, was proposed by Dr. Liu. Within this thesis we aim to utilize credibility theory to model the psychological impact of market participants on European options. Specifically, this is done by modifying the approach originally taken by Black and Scholes. The new model, known as the fuzzy drift parameter model, begins by replacing the deterministic drift within Brownian motion with a fuzzy parameter. This fuzzy parameter models the psychological impact of market participants. Naturally, as we are dealing in chance theory, the risk-neutral dynamics change from those of Black and Scholes, and thus so does the price of European call options.
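For reference, a minimal standard Black-Scholes European call sketch, the model the fuzzy drift parameter approach modifies; the inputs are hypothetical:

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Standard Black-Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical inputs
print(round(black_scholes_call(S=100, K=105, T=0.5, r=0.05, sigma=0.2), 4))
```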
- Item (Open Access): Quality control charts under random fuzzy measurements (2007). Thoutou, Sayi Mbani; Guo, Renkuan. We consider statistical process control charts as tools for monitoring changes and identifying process variations and their causes in industrial (manufacturing) processes, helping manufacturers take appropriate action, rectify problems or improve manufacturing processes so as to produce good-quality products. As an essential tool, process control charts have always received attention from researchers, and the sample sizes required for establishing control charts are often under discussion, depending on the field of study. Of late, the problem of fuzziness and randomness, often brought into modern manufacturing processes by shortening product life cycles and diversification (in product designs, raw material supply, etc.), has compelled researchers to invoke quality control methodologies in their search for high customer satisfaction and better market share (Guo et al 2006). We herein focus our attention on small sample sizes and on the development of quality control charts in terms of the economic design of quality control charts, based on credibility measure theory under random fuzzy measurements and small-sample asymptotic distribution theory. Economic process data collected from the study of Duncan (1956) are analysed in terms of these new developments as an illustrative example.
- Item (Open Access): Statistical-grey consistent grey differential equation modelling (2007). Cui, Yan Hong; Guo, Renkuan. Includes abstract. Includes bibliographical references (p. 152-156).