New York

Researchers have developed an app that could help people speak the language of eyes — literally.

The smartphone app, developed by researchers working with Microsoft, interprets eye gestures in real time and decodes them into predicted utterances, making communication possible.

Called GazeSpeak, the app is designed to help people with amyotrophic lateral sclerosis (ALS), a condition in which individuals gradually lose their strength and the ability to speak, eat or move. The scientists, part of the Enable team at Microsoft Research, developed GazeSpeak for people with ALS who can move their eyes but cannot speak. The app will be available on the Apple App Store in May.