
AI-based-system-application-control

AI Technology to assist elderly and mobility-impaired individuals

This innovative hands-free human-computer interaction system relies on facial landmarks and advanced computer vision technologies, particularly utilizing Dlib's face detection and landmark prediction. The project focuses on eye movements for cursor control, employing eye tracking to facilitate actions like left and right-clicking. Additionally, speech recognition is integrated for system control, eliminating the need for physical contact with traditional input devices. The deliberate exclusion of facial gestures, including mouth tracking, simplifies the interaction model, ensuring a streamlined and accessible user experience. This project signifies a cutting-edge advancement in hands-free computing, showcasing the potential of combining eye movement and speech recognition technologies for intuitive control.