Brain-Computer Interface: Home Environment Control for Paralyzed Patients
by Eesh Trivedi
A few words from the participant(s)
What steps did you take to develop your project?
I became intrigued by ALS after watching the movie “The Theory of Everything” about Stephen Hawking. I was inspired by how, despite his disability, he was able to achieve such great success, and it led me to want to know more about Amyotrophic Lateral Sclerosis. I learned how patients with ALS gradually lose all control of their voluntary muscles – losing the ability to move or even speak – while retaining a perfectly preserved brain capable of outstanding scientific work. There is no damage to the sensory system, so the patient can still perceive the ambient light and temperature. I researched the system used by Stephen Hawking and found out that it was custom-made for him by IBM: he could use a twitch of his facial muscle to select a word or phrase from options that toggled on screen. Communication using the device was slow and painful, but he was able to convey his thoughts. I then decided to make a simpler, low-cost version of a device that could perform a similar function.
I decided to use a Raspberry Pi-based system because I had learned coding in Python at school.
I made the device using a commercially available single-lead EEG detector that could detect eye blinks from the brain signal. The first version worked, but I realized that the system toggled between the options too quickly, so it was hard to time an eye blink to select the device I wanted to control. I therefore altered my code and made a second version with longer toggle intervals, making it easier to choose the right device.
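The selection logic described above can be sketched in Python: the menu cycles at a fixed dwell time, and a blink that arrives during an option's window selects that option. The option names, the dwell-time value, and the function names here are illustrative assumptions, not the author's actual code.

```python
# Sketch (assumed, not the author's code) of blink-timed menu selection.
OPTIONS = ["light", "fan", "call-bell"]  # hypothetical menu of devices

def selected_option(blink_time, dwell=3.0, options=OPTIONS):
    """Return the option highlighted when a blink occurs at blink_time.

    dwell: seconds each option stays highlighted before toggling to the
    next one. Version 2 of the device increased this interval so a blink
    is easier to time correctly.
    """
    slot = int(blink_time // dwell) % len(options)
    return options[slot]

# With a 3-second dwell, a blink at t = 7.5 s lands in the third window.
print(selected_option(7.5))             # -> call-bell
# With a 1-second dwell (like version 1), the same blink hits window 7,
# which wraps around to the second option -- easy to mistime.
print(selected_option(7.5, dwell=1.0))  # -> fan
```

This also shows why lengthening the dwell time helped: with a longer window, small errors in blink timing still land on the intended option.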
Why are you competing?
Currently, there are no simple systems available to help a paralyzed person control their ambient environment – to turn on a fan, switch a light on or off, or call for an attendant. The system used by Stephen Hawking, for example, was custom-made by IBM, letting him select a word or phrase with a twitch of his facial muscle; communication with it was slow and painful, though he was able to convey his thoughts.
The product I made provides a novel, workable solution at a low cost: a simple Raspberry Pi processor takes input from a brain-computer interface and, using the code I developed, controls the ambient environment – light, fan, and call-bell – for a paralyzed person.
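On the output side, the Pi would drive each device through a GPIO pin and a relay. A minimal sketch of that dispatch logic is below; the pin numbers and the `EnvironmentController` name are assumptions, and an injected `write` function stands in for `RPi.GPIO.output` so the logic can run off-device.

```python
# Sketch (assumed pin numbers and relay wiring) of mapping a selected
# option to a Raspberry Pi GPIO pin.
PIN_MAP = {"light": 17, "fan": 27, "call-bell": 22}  # hypothetical BCM pins

class EnvironmentController:
    def __init__(self, write):
        # write(pin, level) -- on a real Pi this would wrap RPi.GPIO.output.
        self.write = write
        self.state = {name: False for name in PIN_MAP}

    def toggle(self, option):
        """Flip the relay for the chosen option and return its new state."""
        self.state[option] = not self.state[option]
        self.write(PIN_MAP[option], self.state[option])
        return self.state[option]

# Off-device usage: record writes instead of driving real hardware.
log = []
ctrl = EnvironmentController(lambda pin, level: log.append((pin, level)))
ctrl.toggle("fan")   # fan on  -> writes (27, True)
ctrl.toggle("fan")   # fan off -> writes (27, False)
```

Injecting the `write` function keeps the control logic testable without the Pi attached; swapping in the real GPIO call is a one-line change.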
The product can be developed further for additional applications – for example, to control other functions for paralyzed people like controlling a motorized scooter.
It could additionally be developed for use in video gaming, where, in addition to the fingers and thumbs of both hands, extra actions could be triggered by an eye blink. Finally, it could prove helpful in metaverse VR meetings, where eye-blink signals could be used to control specific actions or devices.