Autonomous classification and spatial location of objects from stereoscopic image sequences for the visually impaired

Date

2022-07-20

Authors

Sivate, Themba M.
Pillay, Nelendran
Moorgas, Kevin
Singh, Navin

Publisher

IEEE

Abstract

One of the main problems faced by visually impaired individuals is the inability, or difficulty, to identify objects. A visually impaired person usually wears glasses that enlarge or focus on nearby objects and therefore relies heavily on physical touch to identify an object. Walking along a road or navigating to a specific location is challenging when vision is lost or reduced, which increases the risk of an accident. This paper proposes a simple, portable machine vision system that assists the visually impaired by providing auditory feedback about nearby objects in real time. The proposed system consists of three main hardware components: a single-board computer, a wireless camera, and an earpiece module. The YOLACT object detection library is used to detect objects in the captured images, and the detected object labels are converted to an audio signal using the Festival Speech Synthesis System. Experimental results show that the system is efficient and capable of providing real-time audio feedback of detected objects to the visually impaired person.
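
The abstract describes a capture, detect, and announce pipeline. Below is a minimal sketch of such a loop, assuming a Python environment with OpenCV for frame capture; the detect_objects() stub merely stands in for YOLACT inference, and Festival is driven through its command-line --tts mode. The camera index, function names, and announcement phrasing are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of the capture -> detect -> announce loop described
    # in the abstract. detect_objects() is a placeholder for YOLACT inference;
    # Festival is invoked via its command-line text-to-speech mode.
    import subprocess

    import cv2  # OpenCV, assumed available on the single-board computer


    def detect_objects(frame):
        # Placeholder: a real implementation would run the YOLACT network on
        # `frame` and return the class labels of detected instances,
        # e.g. ["person", "chair"].
        return []


    def speak(text):
        # `festival --tts` reads plain text from standard input and plays the
        # synthesised speech through the default audio output (the earpiece).
        subprocess.run(["festival", "--tts"], input=text.encode(), check=True)


    def main():
        camera = cv2.VideoCapture(0)  # index 0 assumed for the wireless camera
        try:
            while True:
                ok, frame = camera.read()
                if not ok:
                    break
                labels = detect_objects(frame)
                if labels:
                    # Announce the detected objects to the user.
                    speak("Detected " + ", ".join(labels))
        finally:
            camera.release()


    if __name__ == "__main__":
        main()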

Keywords

Visually impaired, Single-board computer, Portable, Bluetooth, Computer vision, Audio feedback, Object detection

Citation

Sivate, T.M., Pillay, N., Moorgas, K. and Singh, N. (2022). Autonomous classification and spatial location of objects from stereoscopic image sequences for the visually impaired. In: 2022 International Conference on Electrical, Computer and Energy Technologies (ICECET). IEEE. doi:10.1109/icecet55527.2022.9872538

DOI

10.1109/icecet55527.2022.9872538
