Technology now makes it possible to understand subtle dynamics of the human body, such as eye movements, more effectively. Is a student having difficulty learning? Is a driver distracted by a phone call, or getting drowsy? Is a shopper interested in a product?
The eyes are speaking, as long as we can understand them! A wearable computer with AI learning capabilities can make this happen. Such a wearable computer can now capture eye movements unobtrusively, through a small device attached near the eye and fixed on the ear. With artificial intelligence, these tiny eye-movement signals can be analyzed and understood.
This is exciting progress, since the new system opens up possibilities for many applications: from measuring cognitive learning load for students and workers in general, to detecting distraction and drowsiness in drivers, to market and consumer analysis in grocery stores and marts. Furthermore, the eyes can send commands, allowing people with disabilities to control wheelchairs, or anyone to control appliances at home! Dr. Qingxue Zhang, of the Electrical and Computer Engineering Department at the joint campus of Indiana University and Purdue University, believes the technology can help people live smarter, healthier lives.
One experiment that Dr. Zhang's lab has conducted, which showcases the potential of the eyeSay system, is an eye-writing function that is expected to give ALS patients a way to speak. ALS, or amyotrophic lateral sclerosis, is a progressive neurodegenerative disease that affects nerve cells in the brain and spinal cord. In short, the brain loses its connections with the muscles; patients become 'locked in'. ALS patients may therefore lose the ability to speak, write, move, and perform other muscular functions. Unfortunately, around 450,000 people worldwide live with ALS, and someone is diagnosed with it every 90 minutes. But many patients can still move their eyes!
The Ph.D. student Zou, supervised by Dr. Zhang, is studying a new technology that aims to marry wearable computing with AI for eye-movement perception and analytics. A prototype system has been developed that can now robustly detect 150 different eye-written words, evaluated in a 4,500-trial experiment. This research progress will soon be published at the top conference in the consumer electronics area, the 39th IEEE International Conference on Consumer Electronics (ICCE) in 2021, which is held annually in Las Vegas in January just before the global CES trade show and provides leading insights through many pioneers, such as Steve Wozniak, who co-founded Apple in 1976 with Steve Jobs.
The figure illustrates how the whole AI-powered wearable system works and how it can potentially benefit many areas. The team has leveraged cutting-edge AI technology, deep learning, to create an artificial neural network with millions of neurons for information analysis and abstraction. The learned hierarchical patterns are encoded in the network's synapses, which then automatically decode similar patterns when the network sees new eye-movement signals, thereby revealing the meaning behind the eye signal.
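As a rough illustration of the decoding idea described above (and not the lab's actual eyeSay model), the sketch below wires up a tiny feed-forward network in NumPy: the weight matrices play the role of learned synapses, and a forward pass maps a window of eye-movement samples to one of 150 word classes. The signal length, layer sizes, and random weights are all illustrative assumptions; a trained system would learn these weights from labeled eye-writing trials.

```python
import numpy as np

rng = np.random.default_rng(0)

SIGNAL_LEN = 128   # samples per eye-written word window (assumed)
HIDDEN = 64        # hidden "neurons"; real deep networks use far more
VOCAB = 150        # the prototype reportedly distinguishes 150 words

# Randomly initialized "synapses" (weights); training would tune these
# so that hierarchical eye-movement patterns are encoded in them.
W1 = rng.normal(0.0, 0.1, (SIGNAL_LEN, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, VOCAB))
b2 = np.zeros(VOCAB)

def decode_word(signal: np.ndarray) -> int:
    """Forward pass: abstract the raw signal into features,
    score each vocabulary word, and return the best match."""
    h = np.maximum(0.0, signal @ W1 + b1)   # ReLU feature layer
    logits = h @ W2 + b2                    # one score per word
    return int(np.argmax(logits))           # index of decoded word

# Example: decode one simulated eye-movement window.
fake_signal = rng.normal(size=SIGNAL_LEN)
word_id = decode_word(fake_signal)
print(word_id)  # an index in the range [0, 150)
```

With untrained weights the predicted index is arbitrary; the point is only the data flow, from raw signal, through a learned feature layer, to a word decision.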
Dr. Zhang's lab, Ubiquitous Embedded Intelligence, is continuing this project and plans to advance eyeSay to the next stage of enhancement and validation. This research update reflects how AI-powered wearable computers can promisingly bring new possibilities, such as voice-free communication, attention tracking, cognitive load measurement, driver state monitoring, human-computer interaction, smart-home interaction, and virtual or augmented reality.