Eyespeak is the first autonomous system with an augmented reality interface controlled by eye tracking, enabling users to communicate through their eyes in any position and orientation of the head. It will consist of a pair of augmented reality glasses that project a virtual keyboard (or, via remote desktop, the display of a connected computer) onto the user's field of view.
The glasses will have a micro camera aimed at the user's eyes to determine which key is being selected. After writing a word or phrase, the user will be able to select the "speak" button, which reads aloud what has been written in a synthetic voice through a speaker integrated into the glasses.
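As an illustrative sketch of the selection mechanism described above, the following Python snippet maps a gaze point (here assumed to arrive as normalized screen coordinates from the eye camera) to a key on a virtual keyboard, and registers a keypress only after the gaze has dwelled on the same key for a fixed time. The keyboard layout, dwell threshold, and coordinate convention are assumptions for illustration, not details of the actual system.

```python
import time

# Assumed 3-row virtual keyboard; '^' stands in for the "speak" button.
KEY_ROWS = ["qwertyuiop", "asdfghjkl_", "zxcvbnm_^!"]
DWELL_TIME = 0.8  # seconds the gaze must rest on a key to select it (assumed)

def gaze_to_key(x, y):
    """Return the key under a normalized gaze point (x, y) in [0, 1)."""
    row = min(int(y * len(KEY_ROWS)), len(KEY_ROWS) - 1)
    col = min(int(x * len(KEY_ROWS[row])), len(KEY_ROWS[row]) - 1)
    return KEY_ROWS[row][col]

class DwellSelector:
    """Emit a key once the gaze has stayed on it for DWELL_TIME seconds."""

    def __init__(self):
        self.current = None  # key currently under the gaze
        self.since = 0.0     # time the gaze landed on that key

    def update(self, x, y, now=None):
        """Feed one gaze sample; return a key if a selection fires, else None."""
        now = time.monotonic() if now is None else now
        key = gaze_to_key(x, y)
        if key != self.current:
            # Gaze moved to a new key: restart the dwell timer.
            self.current, self.since = key, now
            return None
        if now - self.since >= DWELL_TIME:
            self.since = now  # reset so the same key can be selected again
            return key
        return None
```

Dwell-time selection is a common choice for gaze-driven keyboards because it needs no extra input channel: holding the gaze steady substitutes for a click, and the threshold trades typing speed against accidental selections.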
The system is based on Epson's BT-200 smart glasses, onto which a micro camera, a microphone, and a speaker will be integrated, all controlled by a microprocessor unit that monitors the position of the user's eyes.