Navigating an unfamiliar indoor environment to find a desired location or public utility is often a challenge for visually impaired people. Roshni is a cell-phone-based system developed by a team of students from the Department of Computer Science and Engineering, IIT Delhi, with the support of the Saksham Trust, New Delhi, and the National Association for the Blind, New Delhi. It empowers visually impaired people to navigate inside an unfamiliar building without any external sighted assistance.
Roshni is an affordable solution using infrared sensors that can be easily retrofitted to existing building infrastructure. The system is entirely controlled by the user, and assistance is provided by means of active audio messages from a mobile phone application, which also stores the map of the building.
When asked what prompted them to design such a solution, Dhruv Jain, the team leader who completed his B.Tech from IIT Delhi this year, said, “According to the stats given by WHO, 87% of the blind are in developing countries and India leads the race here. Presently there is no system available in India and other developing countries that allows a blind person to navigate freely in an indoor environment.”
“We incepted the idea and the design in May 2011, so it has been going for roughly 2.5 years now. The first prototype was completed in Nov 2011. We improved our designs over iterative feedback from users. The current system is the prototype version 3”, he added.
Talking about the extensive research for coming up with this effective solution, Abhinav Saksena, member of the team explained, “The development of such indoor navigation system posed many challenges for the team. We did an extensive research and followed a holistic development model applying insights from social and psychological dimensions into technology development. The iterative multi-centric user studies were necessary to draw conclusions and improve our user-centered designs.”
How it works
The Roshni system has three components: wall modules, a waist-worn user module, and a user interface based on a mobile phone application that provides directions and assists the user in navigating inside a building. A network of wall-mounted infrared sensors is installed on multiple floors of the building. The waist-worn user module consists of IR sensors that interact with the wall modules, an accelerometer to count the number of steps, and an integrated Bluetooth module to interact with the mobile application. The system can be easily installed in buildings with minimal additions to the existing infrastructure, and the user can obtain position, orientation, and navigation directions just by pressing a key on the mobile unit, via audio messages.
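The interaction between the waist-worn module and the phone can be pictured as a stream of small status messages over Bluetooth. The sketch below is an illustrative assumption, not the team's actual protocol; the field names (`module_id`, `step_count`) are made up.

```python
from dataclasses import dataclass

# Hypothetical status update the waist-worn module might send to the
# phone over Bluetooth: the last wall module detected by the IR sensors
# plus the running step count from the accelerometer. Field names are
# assumed, not taken from the actual Roshni protocol.
@dataclass
class UserUpdate:
    module_id: str   # identifier of the last wall-mounted IR module seen
    step_count: int  # steps counted since the previous update

update = UserUpdate(module_id="corridor-2-junction", step_count=14)
print(update.step_count)
```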
Upon entering the building, the user turns on the waist-worn unit and connects it to the mobile phone. The map of the building is then downloaded into the mobile phone application. The map is an undirected graph whose nodes are the access points and the meeting points of corridors, and whose edges are the paths between them. The system uses Dijkstra’s algorithm to guide the user along the most convenient path to the destination. Convenience is preferred over path length: a user will prefer an elevator to stairs even if the stairs offer a shorter path.
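The routing idea can be sketched with a standard Dijkstra search in which edge weights encode "convenience" cost rather than raw distance, so an elevator edge beats a shorter stair edge. The building graph, node names, and weights below are made up for illustration; the article does not describe Roshni's actual cost function.

```python
import heapq

# Illustrative building map: an undirected graph stored as an adjacency
# list, with "convenience" weights (lower = more convenient). The stairs
# edge is deliberately costed higher than the elevator route.
graph = {
    "entrance": [("lobby", 2)],
    "lobby": [("entrance", 2), ("elevator", 3), ("stairs", 1)],
    "elevator": [("lobby", 3), ("floor2", 2)],
    "stairs": [("lobby", 1), ("floor2", 6)],  # shorter, but less convenient
    "floor2": [("elevator", 2), ("stairs", 6), ("office-214", 1)],
    "office-214": [("floor2", 1)],
}

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over convenience-weighted edges."""
    queue = [(0, start, [start])]  # (accumulated cost, node, path so far)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, weight in graph[node]:
            if neighbour not in seen:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None

cost, path = shortest_path(graph, "entrance", "office-214")
print(path)  # the search picks the elevator, the more "convenient" route
```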
The user enters the destination in the mobile application via a QWERTY keypad. The application has an auto-complete feature to assist with entry: it automatically completes the text from the initial characters entered if only one destination matches those characters. The user can change the destination at any time with the Cancel the Travel option and get updates about the current location with the More Information on the Travel option. All information reaches the user as audio messages generated by a text-to-speech engine.
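The auto-complete rule described above, completing only when the prefix is unambiguous, can be sketched in a few lines. The destination list here is hypothetical.

```python
# Sketch of the auto-complete behaviour: expand the typed prefix only
# when it matches exactly one destination. The destination names are
# illustrative, not from the actual Roshni map.
destinations = ["Library", "Lecture Hall 1", "Lecture Hall 2", "Cafeteria"]

def autocomplete(prefix, destinations):
    matches = [d for d in destinations if d.lower().startswith(prefix.lower())]
    # Complete only if the prefix is unambiguous; otherwise leave it alone.
    return matches[0] if len(matches) == 1 else prefix

print(autocomplete("Caf", destinations))  # unique match: completes to Cafeteria
print(autocomplete("Le", destinations))   # ambiguous: prefix is left as typed
```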
As the user walks, the user module constantly sends updates to the mobile application about the user’s current position and step count, and the application dynamically recalculates the path to the destination using the algorithm. Simultaneously, the user is given navigational directions to the next access point (intersection, T-point, lift, etc.): the turn to take (left/right/straight) and the number of steps to travel. After reaching that access point, the user is directed to the next one, and the process continues until he/she reaches the destination. The wall-mounted IR modules have integrated buzzers that beep when the user reaches the destination. If the user takes a wrong turn or deviates from the path, the application warns the user and the path is recalculated from the user’s current location to the destination. The device gives instructions not only on wrong turns but also in wide corridors, and this continuous confirmation of being on the right path is conveyed through vibration alerts.
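The guidance loop just described, comparing each position update against the planned route and replanning on deviation, can be sketched as follows. The route and the node names are hypothetical, and a real implementation would rerun the path search instead of returning a "recalculate" string.

```python
# Minimal sketch of the guidance loop: each position update is checked
# against the planned route; off-route positions trigger replanning.
# Route and node names are illustrative.
planned_route = ["entrance", "lobby", "elevator", "floor2", "office-214"]

def next_instruction(current_node, route):
    """Return the next waypoint, or signal that replanning is needed."""
    if current_node not in route:
        return "recalculate"          # user deviated; replan from here
    i = route.index(current_node)
    if i == len(route) - 1:
        return "destination reached"  # the wall module's buzzer would sound
    return f"proceed to {route[i + 1]}"

print(next_instruction("lobby", planned_route))       # on route
print(next_instruction("stairs", planned_route))      # off route
print(next_instruction("office-214", planned_route))  # at the destination
```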