[vc_row][vc_column][vc_custom_heading text=”Brain-Computer Interface: Home Environment Control for Paralyzed” google_fonts=”font_family:Fjalla%20One%3Aregular|font_style:400%20regular%3A400%3Anormal” font_container=”tag:h3|font_size:35|text_align:center|color:%23ac2d34″][vc_empty_space height=”20″][vc_custom_heading text=”by Eesh Trivedi” google_fonts=”font_family:Fjalla%20One%3Aregular|font_style:400%20regular%3A400%3Anormal” font_container=”tag:h2|font_size:20|text_align:center|color:%23000000″][/vc_column][/vc_row][vc_row row_overlay=”color” row_overlay_opacity=”5″ row_overlay_color=”#0f0f0f”][vc_column][vc_custom_heading text=”A few words from the participant(s)” google_fonts=”font_family:Fjalla%20One%3Aregular|font_style:400%20regular%3A400%3Anormal” font_container=”tag:h3|font_size:25|text_align:center|color:%23ac2d34″][vc_empty_space height=”20″][vc_custom_heading text=”What steps did you take to develop your project?” google_fonts=”font_family:Fjalla%20One%3Aregular|font_style:400%20regular%3A400%3Anormal” font_container=”tag:h4|font_size:20|text_align:center|color:%23000000″][vc_column_text]I became intrigued by ALS after watching “The Theory of Everything,” the movie about Stephen Hawking. I was inspired by how, despite his disability, he achieved such great success, and it led me to want to learn more about amyotrophic lateral sclerosis. I learned that patients with ALS gradually lose all control of their voluntary muscles – losing the ability to move or even speak – while their minds remain fully intact and capable of outstanding scientific work. The sensory system is not damaged, so patients can still perceive ambient light and temperature. I researched the system Stephen Hawking used and found that it was custom-made for him by Intel: he used the twitch of a cheek muscle to select a word or phrase as the system cycled through options. Communication with the device was slow and painful, but it allowed him to convey his thoughts. I then decided to make a simpler, low-cost device that could perform a similar function.[/vc_column_text]

[vc_column_text]I decided to use a Raspberry Pi-based system because I had learned to code in Python at school.

I built the device around a commercially available single-lead EEG headset that can pick out the brain signal produced by an eye blink. The first version worked, but it toggled between the options too quickly, making it hard to time a blink to select the device I actually wanted to control. So I altered my code and made a second version that holds each option longer, which makes it much easier to choose the right device.[/vc_column_text][vc_empty_space height=”20″][vc_custom_heading text=”Why are you competing?” google_fonts=”font_family:Fjalla%20One%3Aregular|font_style:400%20regular%3A400%3Anormal” font_container=”tag:h4|font_size:20|text_align:center|color:%23000000″][vc_column_text]Currently, there is no simple system available to help a paralyzed person control his or her ambient environment – turning a fan on, switching a light on or off, or calling for an attendant. Even the system used by Stephen Hawking was custom-made for him by Intel, letting him use the twitch of a cheek muscle to select a word or phrase. Communication with the device was slow and painful, but it allowed him to convey his thoughts.

The product I made provides a novel, workable solution at low cost: a simple Raspberry Pi runs the code I developed, takes input from a brain-computer interface, and uses it to control the ambient environment – a light, a fan, and a call bell – for a paralyzed person.
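To illustrate the idea, here is a minimal Python sketch of the kind of loop such a device might run – not the actual project code. It assumes a single-lead headset whose blink events are reported by a hypothetical blink_detected() helper, and the GPIO pin numbers and three-second dwell time are also assumptions.

<pre>
import time
import RPi.GPIO as GPIO

# Relay wiring (assumed pin numbers; adjust to the actual board)
DEVICES = {"Light": 17, "Fan": 27, "Call bell": 22}

DWELL_SECONDS = 3.0   # how long each option stays selected before moving on
POLL_SECONDS = 0.05   # how often to check for a blink

def blink_detected():
    """Placeholder: return True when the EEG headset reports an eye blink.

    In the real device this would come from parsing the single-lead EEG
    stream (e.g. packets arriving over Bluetooth serial) for a blink event.
    """
    return False

def toggle(pin):
    """Flip the relay attached to the given GPIO pin."""
    GPIO.output(pin, not GPIO.input(pin))

def main():
    GPIO.setmode(GPIO.BCM)
    for pin in DEVICES.values():
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

    names = list(DEVICES)
    current = 0
    try:
        while True:
            name = names[current]
            print(f"Selected: {name} (blink to toggle)")
            deadline = time.monotonic() + DWELL_SECONDS
            # Hold this option for the dwell period, watching for a blink.
            while time.monotonic() < deadline:
                if blink_detected():
                    toggle(DEVICES[name])
                    print(f"Toggled {name}")
                    time.sleep(1.0)   # debounce so one blink causes one toggle
                    break
                time.sleep(POLL_SECONDS)
            current = (current + 1) % len(names)   # cycle to the next option
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    main()
</pre>

Lengthening DWELL_SECONDS is the same kind of change that made the second version easier to use: each option stays selected long enough to blink deliberately.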

The product can be developed further for additional applications – for example, controlling other functions for paralyzed people, such as driving a motorized scooter.

It could also be developed for use in video gaming, where, beyond the fingers and thumbs of both hands, an eye blink could control additional actions. Finally, it could prove helpful in VR meetings in the metaverse, where eye-blink signals could trigger specific actions or control devices.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_video link=”https://youtu.be/Xt-BIz-ixY8″][/vc_column][/vc_row]