The design of a data model (DM) for managing durability index (DI) results for national road infrastructure
Master Thesis
2019
Abstract
As part of an R1.14 billion, 64-month concrete construction mega-project that began in May 2013, the Mt Edgecombe Interchange, comprising two incrementally launched bridges of 948 m and 440 m joining uMhlanga and the N2 North, demands adequate systems to measure durability compliance. Construction contracts of this nature generate thousands of test results that must be assessed for variability, outliers and compliance for quality assurance, in line with current performance-based specifications such as those contained in COTO (2018a; 2018b), derived from COLTO (1998), which require judgement based on statistical principles. Since the inception of Durability Index (DI) performance-based specifications in 2008, over 12 000 DI test results, or determinations, have accumulated within a repository at the University of Cape Town. The performance-based approach in South Africa is thus a decade into maturity, and considerable amounts of actual site data are collected daily; these data are significant for refining the DI values in performance-based specifications, for the long-term monitoring of Reinforced Concrete (RC) structures in a full-scale environment, and for other research and development (R&D) initiatives. Data modelling can be defined as the process of designing a data model (DM) for data to be stored in a database. A DM is commonly designated into three main types: a conceptual DM defines what the system contains; a logical DM defines how the system should be implemented regardless of the Database Management System (DBMS); and a physical DM describes how the system will be implemented using a specific DBMS. The main objective of this study is to design a DM that is essentially a conceptual and logical representation of the physical database required to ensure durability compliance for RC structures.
Sound database design principles are needed to guide the entire design process. Duplicate or redundant data consumes unnecessary storage and increases the probability of errors and inconsistencies. Therefore, subdividing the data within the conceptual data model (DM) into distinct groups or topics, broken down further into subject-based tables, helps eliminate redundant data. The data contained within the database must be correct and complete: incorrect or incomplete information results in reports with mistakes, and any decisions based on such data will be misinformed. The database must therefore support and ensure the accuracy and integrity of the information, as well as accommodate data processing and reporting requirements. An explanation and critique of the current durability specification is also presented, since information is required on how to join the database tables to create meaningful output. The conceptual DM established the basic concepts and the scope for the physical database by designing a modular structure, or general layout, for the database. This process established the entities or data objects (distinct groups), their attributes (properties of distinct groups) and their relationships (dependencies or associations between groups). The logical database design phase is divided into two main steps. In the first step, a DM is created to ensure minimal redundancy and the capability to support user transactions; the output of this step is a logical DM, a complete and accurate representation of the topics to be supported by the database. In the second step, the Entity Relationship Diagram (ERD) is mapped to a set of tables, and the structure of each table is checked using normalization.
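The mapping of ERD entities to subject-based tables described above can be illustrated with a minimal sketch. The table and column names below (a specimen entity referenced by a material-test entity) are hypothetical, not taken from the thesis: the point is that specimen attributes are stored once and each test row references them via a foreign key, eliminating redundant data.

```python
import sqlite3

# In-memory database for illustration only; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE specimen (
        specimen_id INTEGER PRIMARY KEY,
        structure   TEXT NOT NULL,
        cast_date   TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE material_test (
        test_id     INTEGER PRIMARY KEY,
        specimen_id INTEGER NOT NULL REFERENCES specimen(specimen_id),
        test_type   TEXT NOT NULL,
        result      REAL NOT NULL
    )""")

# One specimen, two test determinations pointing at it: the specimen
# details are never repeated in the test table.
conn.execute("INSERT INTO specimen VALUES (1, 'Bridge deck', '2019-03-01')")
conn.execute("INSERT INTO material_test VALUES (1, 1, 'OPI', 9.8)")
conn.execute("INSERT INTO material_test VALUES (2, 1, 'OPI', 9.6)")

# A join reassembles the full picture for reporting.
rows = conn.execute("""
    SELECT s.structure, t.test_type, t.result
    FROM material_test AS t
    JOIN specimen AS s USING (specimen_id)
""").fetchall()
```

The join shows how meaningful output is created from normalized tables without storing any field twice.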
Normalization is an effective means of ensuring that the tables are structurally consistent and logical, with minimal redundancy. The tables were also checked to ensure that they can support the required transactions, and the required integrity constraints on the database were defined. The logical data model (DM) then added information to the conceptual DM elements by defining the database tables, or basic information, required for the physical database. This process established the structure of the data elements, set relationships between them and provided the foundation for the physical database. A prototype of the designed DM, founded on 53 basic-information database tables, is presented. The database tables are split across six modules: references (1), concrete composition (13), execution (4), environment (7), specimens (2) and material tests (26). Correlations between different input parameters were identified, which added further information to the logical DM elements by strengthening the relations between the topics. The extraction of output parameters according to specification limits was conducted by analysing data from five different projects, which served as input for a total of 1054 DI test results, or 4216 determinations. The results were used to conduct parametric studies on the DI values that predominantly affect concrete durability in RC structures. Lastly, a method is proposed that uses the joint probability density function of Durability Index (DI) test results and the achieved cover depth to calculate the probability that both random variables are outside specification limits.
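The joint-probability idea at the end of the abstract can be sketched numerically. This is not the thesis's method: it is a simplified illustration that models the DI results and the achieved cover depth as normal distributions, assumes the two variables are independent (so the joint probability is the product of the marginals), and uses invented means, standard deviations and specification limits.

```python
from statistics import NormalDist

# Hypothetical marginal distributions (all values invented for illustration).
di = NormalDist(mu=9.6, sigma=0.35)     # DI test results
cover = NormalDist(mu=50.0, sigma=6.0)  # achieved cover depth, mm

di_limit = 9.0    # assumed lower specification limit for the DI
cover_min = 40.0  # assumed minimum specified cover depth, mm

p_di_fail = di.cdf(di_limit)        # P(DI below its limit)
p_cover_fail = cover.cdf(cover_min) # P(cover below its minimum)

# Under the independence assumption, the probability that BOTH
# variables are out of specification is the product of the marginals.
p_both_fail = p_di_fail * p_cover_fail
```

In the correlated case the product would be replaced by integrating a joint probability density function over the non-conforming region, which is the generalisation the abstract refers to.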
Reference:
Govender, D. 2019. The design of a data model (DM) for managing durability index (DI) results for national road infrastructure. Master's thesis. Department of Civil Engineering, Faculty of Engineering and the Built Environment, University of Cape Town.