To overcome this challenge, we will build a smart PC that empowers a physically challenged person to control a computational system with the help of facial movements. This is the first version of the prototype; we will soon release the next version with further improvements to the control system.
The mouse cursor moves across screen coordinates ranging from (0, 0) to (1080, 720), where x = 1080 and y = 720 are the screen width and height in pixels.
The solution works on three different functions:
- Blinking detection of the right eye for system operation
- Detection of the eye movement using image processing
- Translation of the eye movement and blink to control the GUI of the PC
These functions come with several challenges, which we solved step by step so that the device works reliably. They are:
- Differentiating between natural and intentional eye blink
Our eyes blink involuntarily at regular intervals so that they stay free from dust and moisture. To perform a left or right mouse click, however, the user has to blink intentionally. The device therefore needs some way of distinguishing involuntary blinks from intentional ones so that each function is performed correctly.
One way to do that is to integrate two sensors: when both detect an eye blink at the same time, it is considered a natural blink. If the blink happens in only one eye (either left or right), it is detected as an intentional blink, resulting in a left or right mouse click.
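The two-sensor rule can be sketched as a small decision function. The function name and boolean interface below are illustrative, not the article's actual code:

```python
# Sketch of the two-sensor blink logic: each argument reports whether
# that eye's sensor detected a closed eye during the same time window.
def classify_blink(left_closed: bool, right_closed: bool) -> str:
    """Classify one blink event reported by the two eye-blink sensors."""
    if left_closed and right_closed:
        return "natural"       # both eyes blink together -> involuntary
    if left_closed:
        return "left-click"    # only the left eye -> intentional left click
    if right_closed:
        return "right-click"   # only the right eye -> intentional right click
    return "none"              # no blink detected
```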
Although this technique is highly effective, it has one problem: it increases the bill of materials by several hundred rupees. Also, people who have a problem with one eye cannot use this solution.
Therefore, the solution was modified in code: if an eye blink lasts less than a second, it is detected as natural, and if the eye stays closed for more than a few seconds, the blink is detected as intentional.
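The timing rule above can be sketched as follows. The exact thresholds are assumptions; the article only says "within a second" and "more than a few seconds":

```python
NATURAL_MAX_S = 1.0       # blinks shorter than this are involuntary (assumed)
INTENTIONAL_MIN_S = 4.0   # blinks at least this long trigger a click (assumed)

def classify_by_duration(closed_s: float) -> str:
    """Classify a blink by how long the eye stayed closed, in seconds."""
    if closed_s < NATURAL_MAX_S:
        return "natural"
    if closed_s >= INTENTIONAL_MIN_S:
        return "intentional"
    return "ignored"   # in-between durations are discarded as ambiguous
```

Discarding the in-between durations avoids false clicks from slow, drowsy blinks that are neither quick reflexes nor deliberate holds.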
- Detecting the eye movement using image processing
For the mouse cursor to move, the system needs at least one reference object on the face and must move the cursor relative to it. Using eye movements, a physically challenged person can operate a PC effortlessly. However, image processing of the eyes alone does not give accurate results, does not work in low-light or completely dark conditions, and the entire process is quite difficult.
Thus, a light mounted on the sensor is tracked instead; it moves whenever the person moves his/her head.
- Translating the eye movement and blink to control the GUI of the PC
We now have a solution for detecting eye blinks and a light mounted on the eyeglasses whose movement can be tracked. But these values are useless unless they are mapped onto the PC GUI to produce accurate movement of the mouse cursor. The cursor needs to cover the entire length and width of the PC monitor, yet a human head can move only up to a certain degree. To solve this problem, a small head movement must be converted into a large pixel movement of the mouse cursor.
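One common way to turn a small head movement into full-screen cursor travel is to map the narrow range of tracked-light positions onto the whole screen resolution. A minimal sketch; the calibration window values are assumptions for illustration:

```python
SCREEN_W, SCREEN_H = 1080, 720   # screen extent used in this article

# Assumed calibration: the tracked light only moves inside this small
# window of the camera frame as the head turns (illustrative values).
HEAD_X_MIN, HEAD_X_MAX = 260, 380
HEAD_Y_MIN, HEAD_Y_MAX = 180, 260

def head_to_cursor(x, y):
    """Map a small head-movement range onto the full screen resolution."""
    nx = (x - HEAD_X_MIN) / (HEAD_X_MAX - HEAD_X_MIN)  # normalise to 0..1
    ny = (y - HEAD_Y_MIN) / (HEAD_Y_MAX - HEAD_Y_MIN)
    nx = min(max(nx, 0.0), 1.0)   # clamp so the cursor stops at screen edges
    ny = min(max(ny, 0.0), 1.0)
    return int(nx * SCREEN_W), int(ny * SCREEN_H)

print(head_to_cursor(320, 220))   # mid-range head position -> (540, 360)
```

With this mapping, a head turn of only ~120 camera pixels sweeps the cursor across all 1080 screen pixels, which is exactly the small-movement-to-large-movement conversion the design calls for.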
We have discussed the aims and challenges of the project. Now we need to develop the basics.
First, prepare the SD card with the latest Raspbian OS and check whether Python IDLE is pre-installed. Next, install the following Python libraries and modules for the project. To do so, open a terminal and run the following commands:
sudo pip3 install opencv-python
sudo pip3 install numpy
sudo pip3 install gpiozero
sudo pip3 install pynput
sudo pip3 install nmap
After installing all the libraries, clone the official OpenCV GitHub repository onto the Raspberry Pi using the following command:
git clone https://github.com/opencv/opencv
Now we are ready to code.
Here, you will need to modify the example code found in the OpenCV repository and fuse your own code into it to prepare the device. For that, open the opencv folder → platforms folder → python and select the mouse.py code. Copy it into a new file named headcontrolGUI.py and save it.
You will need to read the eye-blink sequence from the sensor to trigger a mouse click. To do so, access the GPIO pins by importing the gpiozero module into the code and reading the blink-sensor data. Also, import the pynput module to create a virtual mouse input for the GUI of the Raspberry Pi OS.
Next, create an if condition to check the eye status. If a blink is detected, measure whether it lasts four seconds to determine whether it is intentional or natural. If the blink is intentional, pass the pynput command for a virtual left or right mouse click.
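The steps above can be sketched as follows. The GPIO pin number and the left-click-only loop are assumptions for illustration, not the article's final code; the hardware libraries are imported inside run() so the timing logic stays testable without a Raspberry Pi attached:

```python
import time

INTENTIONAL_S = 4.0   # a blink held at least this long counts as a click

def is_intentional(closed_s: float) -> bool:
    """True when the eye stayed closed long enough to be a deliberate blink."""
    return closed_s >= INTENTIONAL_S

def run(pin: int = 17):   # GPIO pin 17 is an assumption, not from the article
    from gpiozero import Button
    from pynput.mouse import Button as MouseButton, Controller

    sensor = Button(pin)   # reads "pressed" while the eye is closed
    mouse = Controller()   # virtual mouse for the Raspberry Pi OS GUI
    while True:
        sensor.wait_for_press()            # eye closes
        start = time.monotonic()
        sensor.wait_for_release()          # eye opens again
        if is_intentional(time.monotonic() - start):
            mouse.click(MouseButton.left)  # intentional blink -> left click
        # shorter blinks are natural and are ignored
```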
Mount the eye-blink sensor on the glasses and wire it to a Raspberry Pi GPIO pin. Connect a camera to the RPi board and place it at a distance of more than 15 cm, centred between the eyes, so that the entire face fits within the camera frame.
Wear the glasses and run the code. Make sure to mark the red light near the eye (as shown in the video frame) with the mouse so it can be tracked. When successful, the person can operate the PC with eye movements: to move the cursor, move the eye left, right, up or down; to click the mouse, hold a blink for up to four seconds.
Download Source Code