Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study
dc.contributor.advisor | Tsoeu, Mohohlo S. | |
dc.contributor.author | Wolf, Ryan Evan | |
dc.date.accessioned | 2020-02-11T11:33:15Z | |
dc.date.available | 2020-02-11T11:33:15Z | |
dc.date.issued | 2019 | |
dc.date.updated | 2020-01-28T13:03:31Z | |
dc.description.abstract | For many mobile robotic systems, navigating an environment is a crucial step towards autonomy, and Visual Simultaneous Localisation and Mapping (vSLAM) has seen increasingly effective use in this role. However, vSLAM is strongly dependent on the context in which it is applied, often relying on heuristics and special cases to provide efficiency and robustness. It is therefore important to identify the key parameters and factors of a particular context, as these heavily influence the algorithms, processes, and hardware required for the best results. In this body of work, a generic front-end stereo vSLAM pipeline is tested in the context of a small-scale outdoor wheeled robot that occupies less than 1 m³ of volume. The scale of the vehicle constrained the available processing power, Field of View (FOV), and actuation systems, and influenced the image distortions present. A dataset was collected with a custom platform consisting of a (now discontinued) Point Grey Bumblebee stereo camera and an Nvidia Jetson TK1 processor. A stereo front-end feature-tracking framework was described and evaluated both in simulation and experimentally where appropriate. It was found that the small scale adversely affected the lighting conditions, FOV, baseline, and available processing power, all crucial factors to improve upon. The stereo constraint was effective for the robustness criteria, but ineffective in terms of processing power and metric reconstruction. An overall absolute odometry error of 0.25-3 m was produced on the dataset, but the pipeline was unable to run in real time. | |
dc.identifier.apacitation | Wolf, R. E. (2019). <i>Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study</i> (Master's thesis). University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering. Retrieved from http://hdl.handle.net/11427/31018 | en_ZA |
dc.identifier.chicagocitation | Wolf, Ryan Evan. "Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study." Master's thesis, University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering, 2019. http://hdl.handle.net/11427/31018 | en_ZA |
dc.identifier.citation | Wolf, R. 2019. Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study. | en_ZA |
dc.identifier.ris | TY - Thesis / Dissertation AU - Wolf, Ryan Evan AB - For many mobile robotic systems, navigating an environment is a crucial step towards autonomy, and Visual Simultaneous Localisation and Mapping (vSLAM) has seen increasingly effective use in this role. However, vSLAM is strongly dependent on the context in which it is applied, often relying on heuristics and special cases to provide efficiency and robustness. It is therefore important to identify the key parameters and factors of a particular context, as these heavily influence the algorithms, processes, and hardware required for the best results. In this body of work, a generic front-end stereo vSLAM pipeline is tested in the context of a small-scale outdoor wheeled robot that occupies less than 1 m³ of volume. The scale of the vehicle constrained the available processing power, Field of View (FOV), and actuation systems, and influenced the image distortions present. A dataset was collected with a custom platform consisting of a (now discontinued) Point Grey Bumblebee stereo camera and an Nvidia Jetson TK1 processor. A stereo front-end feature-tracking framework was described and evaluated both in simulation and experimentally where appropriate. It was found that the small scale adversely affected the lighting conditions, FOV, baseline, and available processing power, all crucial factors to improve upon. The stereo constraint was effective for the robustness criteria, but ineffective in terms of processing power and metric reconstruction. An overall absolute odometry error of 0.25-3 m was produced on the dataset, but the pipeline was unable to run in real time. DA - 2019 DB - OpenUCT DP - University of Cape Town KW - Engineering LK - https://open.uct.ac.za PY - 2019 T1 - Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study TI - Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study UR - http://hdl.handle.net/11427/31018 ER - | en_ZA |
dc.identifier.uri | http://hdl.handle.net/11427/31018 | |
dc.identifier.vancouvercitation | Wolf RE. Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study [master's thesis]. University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering; 2019 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/31018 | en_ZA |
dc.language.rfc3066 | eng | |
dc.publisher.department | Department of Electrical Engineering | |
dc.publisher.faculty | Faculty of Engineering and the Built Environment | |
dc.subject | Engineering | |
dc.title | Stereo visual simultaneous localisation and mapping for an outdoor wheeled robot: a front-end study | |
dc.type | Master Thesis | |
dc.type.qualificationlevel | Masters | |
dc.type.qualificationname | MSc |
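
Illustrative note on the abstract above: a generic stereo vSLAM front-end of the kind the thesis evaluates typically matches features across a rectified stereo pair, triangulates them to metric 3D points, tracks them into the next frame, and estimates the relative camera motion. The sketch below is a minimal, hedged illustration of that idea in Python with OpenCV; it is not the author's implementation, and the intrinsics, baseline, thresholds, and file names are placeholder assumptions rather than the thesis's actual calibration or dataset.

# Minimal stereo feature-tracking front-end sketch (illustrative only).
# Assumes rectified stereo pairs; the intrinsics, baseline, thresholds,
# and file names below are placeholders, not the thesis's calibration.
import cv2
import numpy as np

FX, FY, CX, CY = 500.0, 500.0, 320.0, 240.0   # assumed pinhole intrinsics (pixels)
BASELINE = 0.12                                # assumed stereo baseline (metres)
K = np.array([[FX, 0, CX], [0, FY, CY], [0, 0, 1]], dtype=np.float64)

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def stereo_triangulate(left, right):
    """Match features across a rectified pair and recover 3D points from disparity."""
    kp_l, des_l = orb.detectAndCompute(left, None)
    kp_r, des_r = orb.detectAndCompute(right, None)
    pts3d, pts2d = [], []
    for m in matcher.match(des_l, des_r):
        (xl, yl), (xr, yr) = kp_l[m.queryIdx].pt, kp_r[m.trainIdx].pt
        disparity = xl - xr
        # Rectified-stereo constraint: corresponding points share (almost) the
        # same image row and have positive disparity.
        if abs(yl - yr) > 2.0 or disparity <= 1.0:
            continue
        z = FX * BASELINE / disparity          # depth from disparity
        pts3d.append([(xl - CX) * z / FX, (yl - CY) * z / FY, z])
        pts2d.append([xl, yl])
    return np.float32(pts3d), np.float32(pts2d)

def track_and_estimate_pose(prev_left, next_left, pts3d, pts2d):
    """Track features into the next left frame and solve PnP for relative motion."""
    if len(pts2d) < 6:
        return None, None
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_left, next_left,
                                             pts2d.reshape(-1, 1, 2), None)
    ok = status.ravel() == 1
    obj, img = pts3d[ok], p1.reshape(-1, 2)[ok]
    if len(obj) < 6:
        return None, None
    _, rvec, tvec, _ = cv2.solvePnPRansac(obj, img, K, None)
    return rvec, tvec

if __name__ == "__main__":
    # Hypothetical file names; substitute frames from an actual rectified sequence.
    l0 = cv2.imread("left_000.png", cv2.IMREAD_GRAYSCALE)
    r0 = cv2.imread("right_000.png", cv2.IMREAD_GRAYSCALE)
    l1 = cv2.imread("left_001.png", cv2.IMREAD_GRAYSCALE)
    pts3d, pts2d = stereo_triangulate(l0, r0)
    rvec, tvec = track_and_estimate_pose(l0, l1, pts3d, pts2d)
    print("relative rotation (Rodrigues vector):", None if rvec is None else rvec.ravel())
    print("relative translation (metres):", None if tvec is None else tvec.ravel())

The same-row check is one simple form of the stereo constraint mentioned in the abstract; a complete front-end would add more careful outlier rejection and feature management on top of this minimal loop.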