Vis-Assist: Computer Vision and Haptic Feedback-Based Wearable Assistive Device for Visually Impaired

dc.contributor.author Dede, Ibrahim
dc.contributor.author Gumus, Abdurrahman
dc.date.accessioned 2025-06-26T20:19:14Z
dc.date.available 2025-06-26T20:19:14Z
dc.date.issued 2025
dc.description.abstract Visual impairment affects millions of people worldwide, posing significant challenges in their daily lives and personal safety. While assistive technologies, both wearable and non-wearable, can help mitigate these challenges, wearable devices offer the advantage of hands-free operation. In this context, we present Vis-Assist, a novel wearable visual assistive device capable of detecting and classifying objects, measuring their distances, and providing real-time haptic feedback through a vibration motor array, all using an integrated low-cost computational unit without the need for external servers. Our study distinguishes itself by utilizing haptic feedback to convey object information, allowing visually impaired individuals to discern between 19 different object classes following a brief training period. Haptic feedback offers an alternative to audio that doesn't block hearing and can be used alongside it, serving as a complementary solution. The performance of the developed wearable device was evaluated through two types of experiments with four participants. The results demonstrate that users can identify the location of objects and thereby prevent collisions with obstacles. The experiments conducted demonstrate that users, on average, can locate a predefined object, such as a chair, within a 40 m² vacant space in under 94 seconds. Furthermore, users exhibit proficiency in finding objects while navigating around obstacles in the same environment, achieving this task in less than 121 seconds on average. The system developed here has high potential to help the self-navigation of visually impaired people and make their daily lives easier. 
To facilitate further research in this field, the complete source code for this study has been made publicly available on GitHub. en_US
dc.description.sponsorship Izmir Yuksek Teknoloji Enstitüsü [2022IYTE-1-0066]; Scientific Research Projects Coordination Unit (BAP) en_US
dc.description.sponsorship We would like to thank the Izmir Institute of Technology for its support of this project through the Scientific Research Projects Coordination Unit (BAP), grant 2022IYTE-1-0066. en_US
dc.identifier.doi 10.1007/s12193-025-00452-5
dc.identifier.issn 1783-7677
dc.identifier.issn 1783-8738
dc.identifier.scopus 2-s2.0-105005780373
dc.identifier.uri https://doi.org/10.1007/s12193-025-00452-5
dc.identifier.uri https://hdl.handle.net/11147/15676
dc.language.iso en en_US
dc.publisher Springer en_US
dc.relation.ispartof Journal on Multimodal User Interfaces
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.subject Wearable Device en_US
dc.subject Visually Impaired en_US
dc.subject Assistive Technology en_US
dc.subject Haptic Feedback en_US
dc.subject Real-Time Detection en_US
dc.subject Edge-AI en_US
dc.title Vis-Assist: Computer Vision and Haptic Feedback-Based Wearable Assistive Device for Visually Impaired en_US
dc.type Article en_US
dspace.entity.type Publication
gdc.author.scopusid 59908418000
gdc.author.scopusid 35315599800
gdc.bip.impulseclass C5
gdc.bip.influenceclass C5
gdc.bip.popularityclass C5
gdc.coar.access metadata only access
gdc.coar.type text::journal::journal article
gdc.collaboration.industrial false
gdc.description.department İzmir Institute of Technology en_US
gdc.description.departmenttemp [Dede, Ibrahim; Gumus, Abdurrahman] Izmir Inst Technol, Dept Elect & Elect Engn, Izmir, Turkiye en_US
gdc.description.endpage 234
gdc.description.publicationcategory Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı en_US
gdc.description.scopusquality Q2
gdc.description.startpage 217
gdc.description.volume 19
gdc.description.woscitationindex Science Citation Index Expanded
gdc.description.wosquality Q3
gdc.identifier.openalex W4410700508
gdc.identifier.wos WOS:001493346400001
gdc.index.type WoS
gdc.index.type Scopus
gdc.oaire.diamondjournal false
gdc.oaire.impulse 0.0
gdc.oaire.influence 2.635068E-9
gdc.oaire.isgreen false
gdc.oaire.popularity 2.1091297E-10
gdc.oaire.publicfunded false
gdc.openalex.collaboration National
gdc.openalex.fwci 4.0838049
gdc.openalex.normalizedpercentile 0.84
gdc.openalex.toppercent TOP 10%
gdc.opencitations.count 0
gdc.plumx.mendeley 4
gdc.plumx.scopuscites 0
gdc.scopus.citedcount 0
gdc.wos.citedcount 2
relation.isAuthorOfPublication.latestForDiscovery ce5ce1e2-17ef-4da2-946d-b7a26e44e461
relation.isOrgUnitOfPublication.latestForDiscovery 9af2b05f-28ac-4018-8abe-a4dfe192da5e