Eye gaze navigation systems allow you to control a wheelchair, home automation, a manipulator robot, and other devices by means of a camera that tracks where you are looking on a computer screen and translates your gaze into commands for the wheelchair or home automation system.
It is very difficult to buy such a chair. There are many scientific publications on this subject, but only very few of these studies are usable by a person with a disability, and even the most realistic ones rarely describe an adaptation to a commercial electric wheelchair.
One of the most realistic projects is by Patrick Joyce, a person with ALS (since deceased).
- https://hackaday.io/project/5426-eye-controlled-wheelchair
- https://github.com/Sean3Don/Eyedrivomatic-github
The system is adapted to a commercial chair by means of a kind of robotic cap that fits over the chair's joystick and manipulates it under program control.
The user looks at a screen where different icons correspond to different actions (go left, go right, stop, etc.). The program "observes" the position of the user's eyes, as well as eye blinks. From the eye position, the program deduces which icon is being looked at and therefore which command the user wants.
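The gaze-to-command step can be sketched roughly as follows. This is a hypothetical illustration, not code from either project: the icon layout, screen coordinates, and dwell-time threshold are all assumptions.

```python
# Hypothetical sketch: mapping a gaze point on screen to a command icon.
# Zone layout and dwell-time threshold are assumptions, not from either project.
from dataclasses import dataclass

@dataclass
class Icon:
    name: str          # command, e.g. "left", "forward", "stop"
    x: int             # top-left corner, screen pixels
    y: int
    w: int             # width and height, pixels
    h: int

    def contains(self, gx, gy):
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

ICONS = [
    Icon("forward", 400,   0, 200, 150),
    Icon("left",      0, 300, 200, 150),
    Icon("right",   800, 300, 200, 150),
    Icon("stop",    400, 600, 200, 150),
]

DWELL_FRAMES = 30  # ~1 s at 30 fps: how long the gaze must rest on an icon

def command_from_gaze(samples):
    """Return an icon name once the gaze has rested on it for DWELL_FRAMES samples."""
    count, current = 0, None
    for gx, gy in samples:
        hit = next((i.name for i in ICONS if i.contains(gx, gy)), None)
        if hit == current and hit is not None:
            count += 1
            if count >= DWELL_FRAMES:
                return hit
        else:
            current, count = hit, 1
    return None
```

The dwell-time requirement is what distinguishes an intentional command from the eyes merely sweeping across an icon; without it, every glance would move the chair.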
Note that the principle used to control the joystick, while universal, is not very precise.
In addition, the program itself must be adapted to the computer running it. It is therefore not a turnkey system.
Like other alternative control methods (EEG headsets, muscle-twitch detection), an eye navigation system will never be as accurate as traditional manual control. Although eye navigation systems can be used outdoors, it is strongly recommended that they be used only indoors. The main reason is a limitation of eye-tracking cameras, which operate in infrared: they do not work reliably in direct sunlight, near a heat source, or even in cloudy conditions.
Another limitation is that this project has no sensors to detect collision risks in any of the four directions of travel. Driving a wheelchair with a joystick is already tricky and requires learning. A gaze-based system is accident-prone, so automatic assistance is needed to control movements and accelerations with respect to the wheelchair's environment.
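The kind of assistance described above could take the form of gating each movement command with distance readings. The sketch below is purely illustrative: the sensor setup, direction names, and safety threshold are assumptions, not part of either project.

```python
# Hypothetical sketch: suppress a movement command when an obstacle is
# too close in the direction of travel. Threshold and sensor layout are
# assumptions, not from either project.
STOP_DISTANCE_CM = 50  # assumed safety margin

def gate_command(command, distances_cm):
    """Return the command to execute, or "stop" if an obstacle blocks it.

    command:      one of "forward", "reverse", "left", "right", "stop"
    distances_cm: dict mapping each direction to the nearest obstacle
                  distance reported by a range sensor on that side
    """
    if command == "stop":
        return "stop"
    # Missing sensor data is treated as "no obstacle" here; a real system
    # would more likely treat a missing or failed sensor as a stop condition.
    if distances_cm.get(command, float("inf")) < STOP_DISTANCE_CM:
        return "stop"
    return command
```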
Patrick Joyce apparently wanted to disseminate his project widely, but his death seems to have halted the efforts of the community he had built via Hackaday.
Another realistic project is that of Bob Paradiso.
https://bobparadiso.com/2015/06/16/eye-gaze-controlled-wheelchair-robotic-arm-and-ecu/
https://github.com/bobparadiso/wheelchairEyeGazeECU/tree/master
The principle is the same as in Patrick Joyce's project, but here the electronic circuit connects directly to the wheelchair's joystick connector. Although older than Patrick Joyce's, this project is more advanced: in addition to controlling the wheelchair, it includes a robotic arm and home automation control.
Although the joystick protocol is relatively standardized, it is not certain that all chairs implement it. On the other hand, the precision achievable in this project is necessarily better than in Patrick Joyce's project. Before planning to use a particular wheelchair, it is therefore necessary to check that its joystick uses a standard protocol and connector. Digital mode (not analog):
- PIN 1: Front
- PIN 2: Rear
- PIN 3: Left
- PIN 4: Right
- PIN 5: Detect
- PIN 6: 5 button
- PINS 7 and 9: 12 V, 100 mA
- PIN 8: Ground
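In digital mode there is one line per direction, so driving the connector amounts to asserting a single direction pin at a time. The sketch below models this logic; the pin numbers follow the list above, but the signal polarity (active-high here) varies by controller and is an assumption.

```python
# Sketch of the digital-mode joystick lines: one active pin per direction.
# Pin numbers follow the connector pinout above; active-high polarity is
# an assumption and must be verified against the actual chair controller.
DIRECTION_PINS = {"front": 1, "rear": 2, "left": 3, "right": 4}

def pin_states(command):
    """Return the logic level (0/1) for pins 1-4 for a given movement command.

    Any command not in DIRECTION_PINS (e.g. "stop") clears all direction lines.
    """
    states = {pin: 0 for pin in DIRECTION_PINS.values()}
    if command in DIRECTION_PINS:
        states[DIRECTION_PINS[command]] = 1
    return states
```

On a real microcontroller each returned level would be written to the corresponding output pin; only one direction line is ever asserted at a time in this model.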
The Arduino controller directly drives the motors of the robotic arm and the IR/RF transmitters that control the surrounding devices.
An obvious improvement to this project would be to control the wheelchair via Bluetooth. Some modern wheelchairs allow remote control over Bluetooth, so it may be possible to interface Bob Paradiso's project with such wheelchairs without interfering with their electronics at all.
As with Patrick Joyce's project, there are no sensors to prevent collisions.
Other interesting projects are: