"WordSense team on OpenAI Whisper Hackathon Hackathon"

Team Idea

AIM: Use technology to help people with disabilities navigate daily life through sensory feedback such as haptic touch.

IDEA:
- Use OpenAI's Whisper to translate sounds and emotions into haptic feedback.
- Currently, people with hearing disabilities often decipher verbal communication through mouth and lip movements.
- What if there were a tool that let people with hearing disabilities FEEL the sounds around them, such as verbal communication, background noise, and street noise? (e.g. when someone is shouting nearby, it gets translated into haptics)
- What if emotions could become sensations, by detecting the pace and intonation of words and translating them into haptics? (a rough code sketch of this idea follows below)
- Redirecting sensory feedback in this way could allow people with disabilities to interact with and understand the world in new, profound ways.

ABOUT: Hi! I am from Belgium with Indian origins and am currently in the United States pursuing a degree in design and technology. I am deeply interested in applying AI across different fields and in helping communities facing particular challenges. I have experience with AI and ML tools and have been actively exploring sustainability, ethics, health, and disability. I am very open to new ideas -- this is just one idea I had, but I would love to hear your thoughts too!
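To make the pace-to-haptics bullet concrete, here is a minimal sketch of what a first prototype could look like. It assumes the open-source openai-whisper Python package is installed and that a file named speech.wav exists; send_haptic_pulse() and pace_to_intensity() are hypothetical placeholders for whatever haptic driver and mapping the team eventually chooses.

```python
# Sketch: transcribe audio with Whisper, estimate speaking pace per segment,
# and map it to a haptic intensity. Assumptions: openai-whisper is installed,
# "speech.wav" is a hypothetical input file, and send_haptic_pulse() stands in
# for a real haptic driver.
import whisper


def send_haptic_pulse(intensity: float, duration: float) -> None:
    """Hypothetical haptic output: just print what a real driver would receive."""
    print(f"haptic pulse: intensity={intensity:.2f}, duration={duration:.2f}s")


def pace_to_intensity(words_per_second: float) -> float:
    """Map speaking pace to a 0-1 intensity (faster speech -> stronger pulse)."""
    # Rough assumption: ~2 words/s is calm speech, 5+ words/s is rapid/agitated.
    return max(0.0, min(1.0, (words_per_second - 2.0) / 3.0))


def main() -> None:
    model = whisper.load_model("base")        # small model for a quick prototype
    result = model.transcribe("speech.wav")   # returns full text plus timed segments

    for seg in result["segments"]:
        duration = seg["end"] - seg["start"]
        n_words = len(seg["text"].split())
        pace = n_words / duration if duration > 0 else 0.0
        send_haptic_pulse(pace_to_intensity(pace), duration)


if __name__ == "__main__":
    main()
```

This only uses segment timing as a stand-in for intonation; a real version would likely add loudness and pitch features on top of the transcription.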
- Heeya Mody (frypie16)
- Fahad Shihab (fahad_shihab546)
- KARTHIK KAIPLODY (karthik_kaiplody566)
- Merik LS (devilmls372)