Browsing by Author "Taylor, David"
Now showing 1 - 20 of 23
- Item (Open Access): Bid-Ask Spread Modelling in the South African Bond Market (2018). Shaw, Matthew; Mohamed, Obeid; Taylor, David. Pitsillis and Taylor (2014) calculate bid-ask spread estimates of South African government bonds over a single year, using the models of De Jong and Rindi (2009) and Huang and Stoll (1997). This dissertation tests the effectiveness of both models by comparing the modelled equity spread estimates against the actual equity spread estimates. Furthermore, this dissertation investigates the stability of the De Jong and Rindi (2009) and Huang and Stoll (1997) models in the bond market by extending the spread estimate dataset to run annually over 5 years. The final section of this dissertation proposes a new method of estimating the bond spread through the use of a Kalman filter, as it can be used to leverage information from an on-screen market (albeit a different market) to imply bid-ask spread estimates in an off-screen market. The results indicate that the Huang and Stoll (1997) model consistently outperforms the De Jong and Rindi (2009) model. Furthermore, the yield estimate results of Pitsillis and Taylor (2014) align with the results obtained in this dissertation. The spread estimate results are stable over the 5-year period, indicating a strong provision of liquidity by the Primary Dealers.
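The De Jong-Rindi and Huang-Stoll models referenced above are richer than this, but the flavour of covariance-based spread estimation can be sketched with the related classic Roll (1984) estimator; the simulation parameters below are purely illustrative:

```python
import numpy as np

def roll_spread(prices):
    """Roll (1984) estimator: effective spread s = 2 * sqrt(-Cov(dp_t, dp_{t-1})),
    where dp are successive trade-price changes."""
    dp = np.diff(prices)
    autocov = np.cov(dp[1:], dp[:-1])[0, 1]
    if autocov >= 0:
        return np.nan  # estimator is undefined when the autocovariance is non-negative
    return 2.0 * np.sqrt(-autocov)

# Illustrative simulation: trades bounce between bid and ask around a random-walk mid
rng = np.random.default_rng(0)
true_spread = 0.10
mid = 100 + np.cumsum(rng.normal(0.0, 0.01, 50_000))
side = rng.choice([-1.0, 1.0], size=mid.size)   # buyer- or seller-initiated
trades = mid + side * true_spread / 2
est = roll_spread(trades)
print(f"estimated spread: {est:.3f}")           # close to the true 0.10
```

The bid-ask bounce induces negative first-order autocovariance in trade-price changes, which is what the estimator inverts.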
- Item (Open Access): Bismut–Elworthy–Li formula for subordinated Brownian motion applied to hedging financial derivatives (Taylor and Francis, 2017-09-27). Kateregga, Michael; Mataramvura, Sure; Taylor, David. The objective of the paper is to extend the results in Fournié, Lasry, Lions, Lebuchoux, and Touzi (1999) and Cass and Friz (2007) for continuous processes to jump processes, based on the Bismut–Elworthy–Li (BEL) formula in Elworthy and Li (1994). We construct a jump process using a subordinated Brownian motion where the subordinator is an inverse α-stable process (Lt)t≥0 with α ∈ (0, 1]. The results are derived using the Malliavin integration-by-parts formula. We derive representation formulas for computing financial Greeks and show that when Lt ≡ t, we retrieve the results in Fournié et al. (1999). The purpose is to bypass the derivative of an (irregular) pay-off function in a jump-type market by introducing a weight term in the form of an integral with respect to subordinated Brownian motion. Using Monte Carlo techniques, we estimate financial Greeks for a digital option and show that the BEL formula performs better for a discontinuous pay-off in a jump asset model setting, while finite-difference methods are better for continuous pay-offs in a similar setting. In summary, this paper demonstrates that the Malliavin integration-by-parts representation formula holds for subordinated Brownian motion and that this representation is useful in developing simple and tractable hedging strategies (the Greeks) in jump-type derivative markets, as opposed to more complex jump models.
- Item (Open Access): Break-Even Volatility (2019). Mitoulis, Nicolas; Taylor, David; Mahomed, Obeid. The profit or loss (P&L) of a dynamically hedged option depends on the implied volatility used to price the option and implement the hedges. Break-even volatility is a method of solving for the volatility which yields no profit or loss when replicating the hedging procedure of an option on a historical share price time series. This dissertation investigates the traditional break-even volatility method on simulated data, shows how the break-even formula is derived, and details the implementation with reference to MATLAB. We extend the methodology to the Heston model by changing the reference model in the hedging process. As a result, characteristic function pricing methods are needed to calculate the Heston model sensitivities. The break-even volatility solution is then found by optimising the continuously delta-hedged P&L over the Heston model parameters.
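The mechanics can be sketched in a plain Black-Scholes setting (the dissertation's MATLAB implementation and Heston extension are richer): delta hedge an option along a price path at a trial volatility, then solve for the volatility that sets the hedged P&L to zero. All parameters below are illustrative.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def bs_delta(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    return norm.cdf(d1)

def bs_price(S, K, r, sigma, tau):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2)

def hedge_pnl(path, K, r, T, sigma):
    """P&L of selling a call at volatility `sigma` and delta hedging along `path`."""
    n = len(path) - 1
    dt = T / n
    cash = bs_price(path[0], K, r, sigma, T)           # premium received
    delta = bs_delta(path[0], K, r, sigma, T)
    cash -= delta * path[0]                            # buy the initial hedge
    for i in range(1, n):
        cash *= np.exp(r * dt)                         # accrue interest
        new_delta = bs_delta(path[i], K, r, sigma, T - i * dt)
        cash -= (new_delta - delta) * path[i]          # rebalance the hedge
        delta = new_delta
    cash *= np.exp(r * dt)
    cash += delta * path[-1] - max(path[-1] - K, 0.0)  # unwind and pay the payoff
    return cash

# A GBM path with known 20% volatility: the BEV should come out near 0.20
rng = np.random.default_rng(1)
T, n, r, true_vol = 1.0, 2000, 0.05, 0.20
dW = rng.normal(0.0, np.sqrt(T / n), n)
path = np.concatenate([[100.0],
                       100 * np.exp(np.cumsum((r - 0.5 * true_vol**2) * T / n
                                              + true_vol * dW))])
bev = brentq(lambda s: hedge_pnl(path, 100.0, r, T, s), 0.05, 1.0)
print(f"break-even volatility: {bev:.3f}")
```

On historical data the same root-finding is applied path by path; hedging at too low a volatility loses money, at too high a volatility makes money, so the P&L has a single economically meaningful zero near the realised volatility.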
- Item (Open Access): Calibrating the LIBOR market model to swaptions with an extension for illiquidity in South Africa (2016). Moodliyar, Leenesh; Taylor, David. The popularity of the LIBOR Market Model (LMM) in interest rate modelling is a result of its consistency with the market practice of pricing interest rate derivatives. In the context of a life insurance company, the LMM is calibrated to swaptions, as they are actively traded for a wide variety of maturities and serve as the natural hedge instruments for many long-dated products with embedded options. Before calibrating the model, we extend the calibration process to address the issue of illiquidity in the South African swaption market. The swaption surface used in calibrating the model is generated with market-implied quotes for the hedgeable component and historical volatilities for the unhedgeable or illiquid component. The three-parameter correlation function proposed by Rebonato (2005) provides the best fit to historical data. We assume a general piecewise-constant parameterisation for the instantaneous forward rate volatilities. These volatilities are then determined analytically using the Rectangular Cascade Calibration Algorithm of Brigo and Morini (2006). The calibration generates a stable volatility term structure, with the instantaneous forward rate volatilities being positive and real. Through an extension of the calibration we are able to capture the benefits of a pure replication component and accommodate a large unhedgeable component in the price faced by life insurance companies in South Africa.
- Item (Open Access): Collaboration networks in economic science (2018). Rose, Michael E.; Georg, Co-Pierre; Taylor, David. When preparing a research article, economists receive feedback from other academics, present at conferences and give talks at seminars. This form of collaboration is termed informal because informal collaborators, unlike authors, have no formal property rights associated with their contribution. Informal collaboration is so widespread that it appears to be part of the academic production function, yet it has received little attention in academia, least of all in Economics, where patterns of informal collaboration differ from those of the natural sciences. Social informal collaboration, the provision of direct feedback, gives rise to a social network, and this thesis examines that network. The analysis focuses on the role of individual scientists in the network, which is estimated by different network centralities. Data originate from about 6000 published research articles from six Financial Economics journals between 1997 and 2011. A theoretical model describes how network centrality proxies the effort informal collaborators exert in a project, and how this improves the citation count of the research paper. We then investigate how observable characteristics of authors determine this and other centrality measures, and find that common metrics such as productivity and number of citations correlate little with network centrality. As information transmission is an important aspect of social networks, we study how the network centrality of economists relates to placement outcomes of their students in the academic job market. These findings suggest that informal networks matter in the production of academic research; that these networks contain information beyond currently used measures of scholarly influence in the profession; and that these networks are used to decrease information asymmetry in the academic labour market.
- Item (Open Access): Historically implied swaption skews using non-parametric methods (2016). Jackson, Evan; Flint, Dylan; Taylor, David. This dissertation aims to derive historically realised volatilities for swaptions of a long-term nature within the South African market, which is illiquid and over-the-counter. To achieve this, the dissertation constructs non-parametric methods which only make use of historical realised data of the underlying variable, rather than any implied pricing history of the derivative itself. Stutzer's (1996) method of canonical valuation is adapted for use with long-term interest rate derivatives. However, under a simulation of swaption prices, canonical valuation is found to have a monotonic increase in pricing error for swaptions of maturities from 2 to 15 years. A new method is constructed, named the relative entropy approach, which is based on the work of Buchen and Kelly (1996) and is capable of pricing long-term interest rate derivatives using a smoothed continuous distribution of the historical realised data of the underlying variable only, while market-implied pricing data can also be incorporated to calibrate the derivative to current market prices. Under simulation this method maintains a consistent and bounded pricing error across swaption maturities of up to 15 years. This method is then used to obtain historically realised volatilities for swaptions of a long-term nature. The derived ten-year tenor swaption skews under the relative entropy approach exhibit smile characteristics similar to those of the market-implied skew over short-term maturities, and maintain a volatility smile, albeit diminishing, across moneyness for maturities up to 20 years. The skews are further tested for sensitivity to the input historical data, as well as the precision of the skew under implementation of the relative entropy approach.
Results show the derived swaption skews to be robust when using a historical data set of more than 1200 observations. The swaption skew is sensitive to the nature of the historical data used, which is representative of particular market characteristics over certain historical periods. The relative entropy approach is concluded to be capable of pricing long-term swaptions in a market where little or no option pricing data exists, and could be considered for use in practical applications.
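The Buchen-Kelly style calibration at the heart of a relative entropy approach can be sketched as a small convex optimisation: on a grid of terminal values, find the maximum-entropy density that reprices a given set of instruments. The grid and the constraint prices below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

# Grid of terminal asset levels and two pricing constraints (the forward and
# one undiscounted call) that the density must reprice, a la Buchen-Kelly (1996).
x = np.linspace(50.0, 200.0, 400)
payoffs = np.column_stack([x, np.maximum(x - 120.0, 0.0)])
targets = np.array([110.0, 8.0])            # hypothetical market prices

def dual(lam):
    """Dual of the entropy maximisation: minimising log Z(lam) - lam . targets
    enforces the moment constraints E[payoff_k] = target_k."""
    return logsumexp(payoffs @ lam) - lam @ targets

lam = minimize(dual, np.zeros(len(targets)), method="BFGS").x
w = payoffs @ lam
p = np.exp(w - logsumexp(w))                # maximum-entropy density on the grid
moments = p @ payoffs
print(moments)                              # close to the targets [110, 8]
```

The optimal density has the exponential-family form p ∝ exp(Σ λ_k payoff_k), so only the Lagrange multipliers need to be solved for; additional swaption or caplet constraints just add columns to `payoffs`.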
- Item (Open Access): Medical students' attitudes towards and perceptions of the Primary Health Care approach (2005). Draper, Catherine; Louw, Graham; Taylor, David; Gibbs, Trevor. The aim of this research was to provide an understanding of medical students' attitudes towards and perceptions of the PHC approach, and this was done using mainly qualitative methods, namely focus groups, interviews, and one questionnaire. This research also investigated students' views of the way in which the PHC approach was taught, their understanding of the PHC approach, what could influence students' views of the PHC approach, the appropriateness of the PHC approach in South Africa, their opinions of the fact that UCT has a PHC-driven MBChB curriculum, their views of the role of doctors in the PHC approach, and a number of other related issues. The main findings were that students enter their medical degree with an expectation of a biomedical emphasis and a lecture-based curriculum.
- Item (Open Access): Modelling Equities with a Stochastic Volatility Jump Diffusion (2018). Gorven, Matthew; Mahomed, Obeid; Taylor, David. The Bates model provides a parsimonious fit to implied volatility surfaces, and its usefulness in developed markets is well documented. However, there is a lack of research assessing its applicability to developing markets. Additionally, research surrounding its usefulness for hedging long term liabilities is limited, despite its frequent use for this purpose. This dissertation dissects the dynamics of the Bates model into the Heston and Merton models in order to separately examine the effects of stochastic volatility and jumps. Challenges surrounding application of this model are investigated through an evaluation of risk-neutral calibration and simulation methods. The model's ability to fit the implied volatility surfaces from the JSE Top 40 equity index is analysed. Lastly, an evaluation of the model's delta and vega hedging performance is presented by comparing it to the hedge performance of other commonly used models.
- Item (Metadata only): Modelling illiquid volatility skews (2014). Crowther, Servaas Marcus; Mahomed, Obeid; Taylor, David. Most markets trade liquidly in options on the market index; in fact, these options often trade at a wide range of strike levels. Thus, using the Black-Scholes model, we can obtain the implied volatilities at the various strike levels, forming the associated implied volatility skew of the market under consideration. This, however, is not always feasible when it comes to the individual stocks within the market, as single-stock options trade far less frequently. This dissertation makes use of data from the Eurozone; in particular, we consider the Euro Stoxx 50 market index and its underlying constituents. Options written on the Euro Stoxx 50 and its constituents are highly liquid, and volatility skews are obtained for the market as well as for most of the single stocks within it. Three cases of illiquid markets are then artificially created, each with an increasing degree of sparseness, mimicking various possible realities. Using principal component analysis, this dissertation aims to find an appropriate model for relating the volatility skew of the index to that of single stocks within the market, in order to fill gaps in the data of the individual stocks' skews. Results indicate that simpler models perform similarly in all scenarios of sparseness, whereas the performance of more complex models decreases as the data becomes sparser. This indicates that basic relationships can be formed between the index and single stocks in cases with relatively low levels of trade in the market, but more accurate estimates are harder to achieve. However, if we use the skew data, as is, as an input to the models, their performance remains by and large the same whether using the full data set or monthly information. This is encouraging, as it means we can fill gaps in the individual stocks' skew data with as good a fit as if we had modelled with a full set of data.
- Item (Open Access): Modelling the South African Inter-Bank Interest Rate Market using a Log-Normal Rational Pricing Kernel Model (2019). Hammond, Graeme; Taylor, David; Mahomed, Obeid. This dissertation examines the performance of two log-normal rational pricing kernel models and their calibration to the South African inter-bank interest rate market. We investigate using Monte Carlo simulation to price caps, floors and swaptions. The performance of both models was tested on single strikes and on entire volatility surfaces. Our results show that a one-factor model cannot reproduce the volatility smile present in the cap/floor market, but can reproduce the at-the-money swaption volatility surface. The two-factor model produces a better calibration to the volatility smile and captures most of the characteristics of the volatility surface.
- Item (Open Access): Multi-curve bootstrapping and implied discounting curves in illiquid markets (2017). Sender, Nina Alexandra; Taylor, David. The credit and liquidity crisis of 2007 triggered a number of inconsistencies in the interest rate market, calling into question some of the standard methods and assumptions used to price and hedge interest rate derivatives. It has been shown that using a single risk-free curve (constructed from market instruments referencing underlying rates of varying tenors) to forecast and discount cash flows is not theoretically correct. Standard market practice has evolved to a multi-curve approach, using different curves to forecast and discount cash flows. The risk-free discount curve is proxied by the Overnight-Indexed Swap (OIS) curve. In South Africa there is no liquid market for OIS. In this dissertation a method is developed to estimate the ZAR OIS curve. A cointegration relationship between the SAFEX Overnight Rate and the 3-month JIBAR rate is shown to exist. This relationship is used in a dual bootstrap algorithm to simultaneously estimate the ZAR OIS curve and the 3-month JIBAR tenor curve, while maintaining arbitrage relationships. The tractability of this method is shown by pricing options written on ZAR OIS.
- Item (Open Access): Multi-curve frameworks and information-based models (2024). Mahomed, Obeid; Taylor, David; McWalter, Thomas. The distinction between bank funding cash and derivative markets was magnified in the aftermath of the 2008 global financial crisis, and further fortified by the need for reference-rate reform following the Financial Stability Board's review of major interest rate benchmarks in 2014. The cognisance of previously negligible liquidity and credit risks has had various implications for market microstructure and risk management. Accordingly, this has created the need for new interest rate modelling frameworks. Part I proposes one such framework, referred to as the “market-based approach”, which is a multi-curve generalisation of the single-curve pricing kernel approach and is motivated by material differences that emerge due to term-related risks when executing compounding strategies at different frequencies. In this framework, a distinct stochastic discount factor is assigned to each tradable term within a given market. This term-cognisant approach is first applied to the deposit market, where a novel argument based on funding-swap duality and a constructed stylised systemic and symmetric setting enables the derivation of a system of arbitrage-free discrete-time calibrated pricing kernels. It is then shown how one may construct an exchange-of-risk mechanism to transfer risks across terms in a fair manner, which in turn enables economically meaningful and theoretically consistent pricing and valuation of financial instruments with features that span across terms. Finally, it is shown that the repo and bank funding markets are also compatible with the market-based approach, which paves the way for the development of derivative pricing and valuation. In Part II, the exchange-of-risk mechanism is generalised using a system of continuous-time pricing kernels and an FX analogy, which results in the creation of the curve-conversion factor process.
This process is then used to derive the across-curve pricing formula, which is a generalisation of the fundamental single pricing kernel formula, and defines the arbitrage-free mechanics of the “xy-approach”, a continuous-time reduced-form abstraction of the market-based approach. As a natural application, consistent multi-curve frameworks are formulated for bank funding cash and derivative markets within emerging and developed economies. Given the xy-approach, existing multi-curve frameworks based on HJM and rational pricing kernel models are recovered, reviewed and generalised; and single-curve models are extended to a multi-curve setting. In a final application, it is shown how the xy-approach offers a flexible framework for solving, in a consistent manner, pricing problems involving financial instruments with floating nominal rate, inflation and foreign exchange exposures. Part III presents a reformulation of the information-based asset pricing framework, introduced by Macrina (2006), within a general non-linear stochastic filtering framework founded upon Markov observation and signal processes, in order to enhance tractability for model development. A general framework for modelling short, instantaneous forward, and discrete forward rates using pricing kernels is derived, which enables the creation of information-based models.
- Item (Open Access): Non-bank financial intermediation – a focus on South Africa (2022). Kemp, Esti; Georg, Co-Pierre; Taylor, David. We measure the non-bank financial intermediation (NBFI) sector of South Africa over time, and how it is connected to the banking system. While the growth of the NBFI sector has outpaced that of banks (driven mainly by collective investment schemes), banks continue to hold the largest share of financial assets. We show relatively high levels of interconnectedness between banks and non-banks, specifically investment funds. We also show high levels of portfolio overlap, or indirect interconnectedness, for money market funds (MMFs) registered in South Africa. Given the limited academic work measuring interconnectedness beyond banks, the second part of the work uses a novel dataset to analyse interconnectedness in South Africa in more detail. We propose a method to compute losses on the financial system resulting from the failure of a bank, based on look-through exposures - i.e. those beyond the direct and indirect balance sheet links. Specifically, we show that the exposures of financial institutions in the SA financial system to the default of one of its "big six" banks may be severely underestimated when only direct and indirect exposures are considered. The default of one of the big six banks causes financial distress to spread throughout the system. Consequently, additional losses accrue to institutions over time that are not covered by the direct and indirect exposures. We introduce the higher-order share of exposure (HSE), which expresses what percentage of an exposure is overlooked when only considering direct and indirect exposures. We show that the HSE is close to 100% for a substantial part of the South African financial system, and that in other parts the HSE rises steeply during times of financial distress, when exposures matter most.
We show that these higher-order losses depend strongly on the network structure of the SA financial system and the robustness of its institutions. In estimating exposures, we confirm an earlier established result, which finds that jointly including multiple asset classes and multiple types of financial exposures is necessary to avoid underestimating losses. This highlights the importance of granular data, and of network-based modelling approaches that take advantage of these data to properly estimate exposures. The third part of the work focuses on identifying and measuring the financial cycle in South Africa, using three different methodologies. The financial cycle is calculated using credit, house prices and equity prices as indicators, and estimated using traditional turning-point analysis, frequency-based filters and an unobserved-components model-based approach. We then consider the financial cycle's main characteristics and examine its relationship with the business cycle. We confirm the presence of a financial cycle in South Africa that has a longer duration and a larger amplitude than the traditional business cycle. Developments in measures of credit and house prices are important indicators of the financial cycle, although the case for including equity prices in the measures is less certain. Periods where financial conditions are stressed are associated with peaks in the financial cycle, suggesting that the estimated financial cycle may have leading-indicator properties similar to those of financial conditions or stress indices. To determine the role of the NBFI sector in the financial cycle, the final part of the work also estimates the non-bank credit cycle. This methodology is applied to estimate the non-bank credit cycles of several economies, to gain insight into differences with the bank credit cycle.
We find that the cyclical properties of non-bank credit cycles differ from those of bank credit: while the duration is similar, the amplitude of non-bank credit is relatively larger. The relationship between bank and non-bank credit is not stable and differs among jurisdictions; at a global level, this relationship becomes less synchronised in the period leading up to the 2008 financial crisis. We argue that monitoring non-bank credit can provide additional information as a leading indicator for periods of financial instability, in particular currency crises. We complement the existing literature on leading indicators for financial crises by showing that bank credit is a useful indicator for systemic banking crises, while non-bank credit is helpful in predicting currency crises, but not vice versa.
- Item (Open Access): Optimal tree methods (2014). Rudd, Ralph; McWalter, Thomas; Taylor, David. Although traditional tree methods are the simplest numerical methods for option pricing, much work remains to be done regarding their optimal parameterization and construction. This work examines the parameterization of traditional tree methods as well as the techniques commonly used to accelerate their convergence. The performance of selected, accelerated binomial and trinomial trees is then compared to an advanced tree method, Figlewski and Gao's Adaptive Mesh Model, when pricing an American put and a down-and-out barrier option.
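For reference, the plain (unaccelerated) Cox-Ross-Rubinstein binomial tree that such studies build on can be sketched as follows; the parameters are illustrative:

```python
import numpy as np

def crr_american_put(S0, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial tree price of an American put."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)             # risk-neutral up-probability
    disc = np.exp(-r * dt)
    S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)  # maturity nodes
    V = np.maximum(K - S, 0.0)                     # payoff at maturity
    for step in range(n - 1, -1, -1):
        S = S[: step + 1] / u                      # asset prices one level back
        V = disc * (q * V[:-1] + (1 - q) * V[1:])  # discounted continuation value
        V = np.maximum(V, K - S)                   # early-exercise check
    return V[0]

price = crr_american_put(100.0, 100.0, 0.05, 0.2, 1.0, 1000)
print(f"American put: {price:.4f}")  # early exercise adds value over the European put
```

The acceleration techniques the abstract refers to (smoothing, Richardson extrapolation, adaptive meshes) all modify this basic backward induction.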
- Item (Open Access): Parameter Estimation for Stable Distributions with Application to Commodity Futures Log-Returns (Taylor and Francis, 2017-05-02). Kateregga, Michael; Mataramvura, Sure; Taylor, David. This paper explores the theory behind the rich and robust family of α-stable distributions to estimate parameters from financial asset log-return data. We discuss four parameter estimation methods: the quantile method, the method of logarithmic moments, maximum likelihood (ML) and the empirical characteristic function (ECF) method. The contribution of the paper is two-fold: first, we discuss the above parametric approaches and investigate their performance through error analysis, arguing that the ECF performs better than the ML over a wide range of values of the shape parameter α, including values close to 0 and 2, and that the ECF has a better convergence rate than the ML. Secondly, we compare the t location-scale distribution to the general stable distribution and show that the former fails to capture skewness which might exist in the data. This is observed by applying the ECF to commodity futures log-return data to obtain the skewness parameter.
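For symmetric stable data the ECF idea can be sketched as a log-log regression on the empirical characteristic function; this simplified estimator is illustrative only, as the paper's ECF method handles the full four-parameter family:

```python
import numpy as np

def sim_symmetric_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck simulation of a standard symmetric alpha-stable law."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * U) / W) ** ((1.0 - alpha) / alpha))

def ecf_alpha(x, t=None):
    """For symmetric stable data with unit scale, |phi(t)| = exp(-|t|^alpha), so
    the slope of log(-log|phi_hat(t)|) against log t estimates alpha."""
    if t is None:
        t = np.linspace(0.2, 1.0, 9)
    phi = np.exp(1j * np.outer(t, x)).mean(axis=1)  # empirical characteristic function
    return np.polyfit(np.log(t), np.log(-np.log(np.abs(phi))), 1)[0]

rng = np.random.default_rng(42)
x = sim_symmetric_stable(1.5, 200_000, rng)
print(f"estimated alpha: {ecf_alpha(x):.2f}")  # statistically near the true 1.5
```

Because stable densities lack closed forms, working through the characteristic function (which is available in closed form) is what makes ECF-type estimators attractive relative to ML.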
- Item (Open Access): Portfolio selection using Random Matrix Theory and L-Moments (2015). Ushan, Wardah; Bosman, Petrus; Taylor, David. Markowitz's (1952) seminal work on Modern Portfolio Theory (MPT) describes a methodology to construct an optimal portfolio of risky stocks. The constructed portfolio is based on a trade-off between risk and reward, and will depend on the risk-return preferences of the investor. Implementation of MPT requires estimation of the expected returns and variances of each of the stocks, and the associated covariances between them. Historically, the sample mean vector and variance-covariance matrix have been used for this purpose. However, estimation errors result in the optimised portfolios performing poorly out-of-sample. This dissertation considers two approaches to obtaining a more robust estimate of the variance-covariance matrix. The first is Random Matrix Theory (RMT), which compares the eigenvalues of an empirical correlation matrix to those generated from a correlation matrix of purely random returns. Eigenvalues of the random correlation matrix follow the Marcenko-Pastur density, and lie within an upper and lower bound. This range is referred to as the "noise band". Eigenvalues of the empirical correlation matrix falling within the "noise band" are considered to provide no useful information. Thus, RMT proposes that they be filtered out to obtain a cleaned, robust estimate of the correlation and covariance matrices. The second approach uses L-moments, rather than conventional sample moments, to estimate the covariance and correlation matrices. L-moment estimates are more robust to outliers than conventional sample moments, in particular when sample sizes are small. We use L-moments in conjunction with Random Matrix Theory to construct the minimum variance portfolio.
In particular, we consider four strategies corresponding to the four different estimates of the covariance matrix: the L-moments estimate and the sample-moments estimate, each with and without the incorporation of RMT. We then analyse each of these strategies in terms of its risk-return characteristics, overall performance and diversification.
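The Marcenko-Pastur filtering step described above can be sketched as follows, on toy one-factor return data (for simplicity only the upper edge of the noise band is used, a common implementation choice):

```python
import numpy as np

def rmt_clean_correlation(R, n_obs):
    """Replace eigenvalues inside the Marcenko-Pastur noise band by their average
    (preserving the trace), then rebuild a valid correlation matrix."""
    n_assets = R.shape[0]
    q = n_assets / n_obs
    lam_max = (1 + np.sqrt(q)) ** 2         # upper edge of the noise band
    w, V = np.linalg.eigh(R)
    noise = w < lam_max
    w[noise] = w[noise].mean()              # flatten the "noisy" eigenvalues
    C = V @ np.diag(w) @ V.T
    d = np.sqrt(np.diag(C))
    return C / np.outer(d, d)               # renormalise to a unit diagonal

# Toy returns: one market factor plus idiosyncratic noise
rng = np.random.default_rng(7)
T, N = 500, 50
market = rng.normal(0.0, 1.0, (T, 1))
returns = 0.5 * market + rng.normal(0.0, 1.0, (T, N))
R = np.corrcoef(returns, rowvar=False)
C = rmt_clean_correlation(R, T)
print(np.diag(C).round(6), np.linalg.eigvalsh(C).min() > 0)
```

The factor eigenvalue (well above the band) is kept, while the bulk of sampling noise is averaged away; the cleaned matrix then feeds the minimum-variance optimisation.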
- Item (Open Access): Realised volatility estimators (2014). Königkrämer, Sören; Taylor, David. This dissertation is an investigation into realised volatility (RV) estimators. Here, RV is defined as the sum-of-squared-returns (SSR) and is a proxy for integrated volatility (IV), which is unobservable. The study focuses on a subset of the universe of RV estimators. We examine three categories of estimators: historical, high-frequency (HF) and implied. The need to estimate RV arises predominantly in the hedging of options, and is not concerned with speculation or forecasting. The main research questions are: (1) What is the best RV estimator in a historical study of S&P 500 data? (2) What is the best RV estimator in a Monte Carlo simulation when delta hedging synthetic options? (3) Do our findings support the stylized fact of 'asymmetry in time scales' (Cont, 2001)? In answering these questions, further avenues of investigation are explored. Firstly, the VIX is used as the implied volatility. Secondly, the Monte Carlo simulation generates stock price paths with random components in the stock price and the volatility at each time point, where the distribution of the input volatility is varied. The question of asymmetry in time scales is addressed by varying the term and frequency of the historical data. The results of the historical study and the Monte Carlo simulation are compared. The SSR and two of the HF estimators perform best in both cases. Estimators using long-term data are shown to perform very poorly.
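The SSR proxy itself is straightforward; a minimal sketch with an illustrative sampling frequency and parameters:

```python
import numpy as np

def realised_vol(prices, dt):
    """Annualised realised volatility: square root of the scaled sum of squared log-returns."""
    r = np.diff(np.log(prices))
    return np.sqrt((r ** 2).sum() / (len(r) * dt))

# Illustrative GBM with 25% annual volatility, sampled 78 times a day for 252 days
rng = np.random.default_rng(3)
sigma, n = 0.25, 252 * 78
dt = 1.0 / n                                   # one year of observations in total
increments = (0.05 - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
prices = 100.0 * np.exp(np.concatenate([[0.0], np.cumsum(increments)]))
rv = realised_vol(prices, dt)
print(f"realised volatility: {rv:.3f}")        # close to the true 0.25
```

At high sampling frequencies the SSR converges to the integrated volatility, which is why microstructure noise (and hence the HF estimators the dissertation compares) becomes the central practical issue.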
- Item (Restricted): Recovery theorem: expounded and applied (2014). Backwell, Alex; Taylor, David. This dissertation is concerned with Ross' (2011) Recovery Theorem. It is generally held that a forward-looking probability distribution is unobtainable from derivative prices, because the market's risk preferences are conceptually inextricable from the implied real-world distribution. Ross' result recovers this distribution without making the strong preference assumptions assumed necessary under the conventional paradigm. This dissertation aims to give the reader a thorough understanding of Ross Recovery, both from a theoretical and a practical point of view. This starts with a formal delineation of the model and a proof of the central result, motivated by the informal nature of Ross' working paper. This dissertation relaxes one of Ross' assumptions and arrives at the equivalent conclusion. This is followed by a critique of the model and assumptions. An a priori discussion only goes so far, but potentially problematic assumptions are identified, chief amongst which is the time-additive preferences of a representative agent. Attention is then turned to practical application of the theorem. The author identifies a number of obstacles to applying the result (some of which are somewhat atypical and have not been directly addressed in the literature) and suggests potential solutions. A salient obstacle is calibrating a state price matrix. This leads to an implementation of Ross Recovery on the FTSE/JSE Top40. The suggested approach is found to be workable, though certainly not the final word on the matter. A testing framework for the model is discussed and the dissertation is concluded with a consideration of the findings and the theorem's applicability.
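Once a state-price matrix has been calibrated, the linear-algebra core of Ross Recovery is a Perron-Frobenius eigenvector computation; the toy matrix below is hypothetical:

```python
import numpy as np

# Toy 3-state state-price matrix: P[i, j] is the price, in state i, of an
# Arrow-Debreu security paying 1 if state j occurs next period (numbers illustrative).
P = np.array([[0.55, 0.30, 0.10],
              [0.25, 0.50, 0.20],
              [0.10, 0.35, 0.55]])

w, V = np.linalg.eig(P)
k = np.argmax(w.real)            # Perron root of the positive matrix P
delta = w[k].real                # the recovered discount factor
z = np.abs(V[:, k].real)         # strictly positive Perron eigenvector

# Recovered natural transition probabilities: F[i, j] = P[i, j] * z[j] / (delta * z[i])
F = P * z[None, :] / (delta * z[:, None])
print(F.sum(axis=1))             # each row sums to 1, since P z = delta z
```

Because P z = δ z, each row of F sums to one by construction, so F is a genuine transition matrix: the real-world distribution has been disentangled from risk preferences using only the state-price matrix.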
- Item (Open Access): Stable processes: theory and applications in finance (2017). Kateregga, Michael; Mataramvura, Sure; Taylor, David. This thesis is a study of stable distributions and some of their applications in understanding financial markets. Three broad problems are explored. First, we study a parameter and density estimation problem for stable distributions using commodity market data. We investigate and compare the accuracy of the quantile, logarithmic-moments, maximum likelihood (ML) and empirical characteristic function (ECF) methods. It turns out that the ECF is the most recommendable method, challenging the literature, which instead suggests the ML. Secondly, we develop an affine theory for subordinated random processes and apply the results to pricing commodity futures in markets where the spot price includes jumps. The jumps are introduced by subordinating the Brownian motion in the spot model by an α-stable process, α ∈ (0, 1], which leads to a new pricing approach for models with latent variables. The third problem is the pricing of general derivatives and risk management based on Malliavin calculus. We derive a Bismut-Elworthy-Li (BEL) representation formula for computing financial Greeks under the framework of Brownian motion subordinated by an inverse α-stable process with α ∈ (0, 1]. This subordination by an inverse α-stable process allows zero returns in the model, rendering it fit for illiquid emerging markets. In addition, we demonstrate that the model is best suited for pricing derivatives with irregular payoff functions, compared to the traditional Euler methods.
- Item (Restricted): Subordinated affine structure models for commodity future prices (Taylor and Francis, 2018-08-20). Kateregga, Michael; Mataramvura, Sure; Taylor, David. To date, the existence of jumps in different sectors of the financial market is certain, and the commodity market is no exception. While there are various models in the literature on how to capture these jumps, we restrict ourselves to using Brownian motion subordinated by an α-stable process, α ∈ (0, 1), as the source of randomness in the spot price model to determine commodity future prices, a concept which is not new either. However, the key feature of our pricing approach is a new, simple technique derived from our novel theory for subordinated affine structure models. Different from existing filtering methods for models with latent variables, we show that the commodity future price under a one-factor model with a subordinated random source driver can be expressed in terms of the subordinator, which can then be reduced to the latent regression models commonly used in population dynamics, with their parameters easily estimated using the expectation maximisation method. In our case, the underlying joint probability distribution is a combination of the Gaussian and stable densities.