Educational Tactile Audio Gallery (Edu-TAG)

  • Edu-TAG is a device that helps the visually impaired understand essential tactile diagrams by making the diagrams touch-sensitive and providing audio feedback.
  • Demonstrated to visually impaired students and at IEEE SENSORS 2019. Overall feedback was positive, and plans are underway to develop a production-ready product.
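The core interaction described above can be sketched as a lookup from touched diagram regions to audio descriptions. This is an illustrative sketch only; the region names and file paths are hypothetical, not taken from the actual device.

```python
# Hypothetical region-to-audio mapping for a tactile diagram.
# In the real device, touch sensing would trigger on_touch(), and the
# returned clip would be played back to the student.
AUDIO_MAP = {
    "heart_left_ventricle": "audio/left_ventricle.wav",
    "heart_right_atrium": "audio/right_atrium.wav",
}

def on_touch(region_id):
    """Return the audio clip for a touched region, or None if unmapped."""
    return AUDIO_MAP.get(region_id)
```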


  • Online writing platform that facilitates hassle-free classroom teaching.
  • Notes are saved in real time, letting students focus on the teaching rather than on note-taking; notes are stored in a structured format for easy access at any time.
  • Developed the back-end and handled integration with the front-end. Currently in use at IIIT Bangalore.
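The real-time, structured saving described above can be sketched as a store keyed by course and lecture, where each autosave overwrites the previous snapshot. The in-memory dictionary here is a stand-in for the actual database back-end, and the key structure is an assumption.

```python
# Minimal sketch of structured note autosaving: notes are keyed by
# (course, lecture) so they can be retrieved easily later.
store = {}

def save_note(course, lecture, text):
    """Persist the latest snapshot of a note (autosave overwrites)."""
    store[(course, lecture)] = text

def load_note(course, lecture):
    """Retrieve a saved note, or an empty string if none exists."""
    return store.get((course, lecture), "")
```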

Dialect Classification

  • Designed a system which classifies nine distinct British dialects (IViE speech corpus) based on acoustic properties of speech signals.
  • MFCC and spectral-flux features were extracted to construct the feature space.
  • Machine-learning techniques such as SVM, Random Forest, and kNN classifiers were used to build and evaluate the classification.
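The spectral-flux feature mentioned above measures how quickly the magnitude spectrum changes between consecutive frames. A minimal sketch, with illustrative frame and hop sizes (the project's actual parameters are not stated):

```python
import numpy as np

def spectral_flux(signal, frame_len=256, hop=128):
    """Mean positive spectral difference between consecutive frames."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len, hop)]
    # Windowed magnitude spectrum of each frame
    mags = [np.abs(np.fft.rfft(f * np.hanning(frame_len))) for f in frames]
    # Sum of positive spectral increases from one frame to the next
    flux = [np.sum(np.clip(m2 - m1, 0, None))
            for m1, m2 in zip(mags, mags[1:])]
    return float(np.mean(flux))
```

A steady signal yields near-zero flux, while noisy or rapidly changing speech yields larger values, which is what makes it useful as an acoustic feature.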

IR Based Eye Blink Detection

  • The goal of this project is to detect eye blink sequences with an IR sensor.
  • Each detected sequence is transmitted over the internet to an Android application, which performs a specific task based on the received sequence.
  • The project involved forward error correction (FEC), sensor programming, IoT programming and app development.


Publications
  • A. Ramesh, N. Raj, T. K. Srikanth and M. Rao, "Design of a tactile audio gallery for visually impaired students," 2019 IEEE SENSORS, Montreal, QC, Canada, 2019, pp. 1-4, doi: 10.1109/SENSORS43011.2019.8956886.
  • N. Raj, A. Ramesh, T. K. Srikanth and M. Rao, "Live Demonstration: A tactile audio gallery for visually impaired students," 2019 IEEE SENSORS, Montreal, QC, Canada, 2019, pp. 1-1, doi: 10.1109/SENSORS43011.2019.8956527.