Pedro, a former IT security specialist, was forced to quit his job in 2021 when his motor-neurone disease (MND), a neurodegenerative condition, worsened. He can no longer get around without the assistance of his wife or carer and is largely non-verbal.

But when Pedro speaks, his lightly accented English flows with ease. His voice is generated by an artificial-intelligence model, trained on clips recorded before he lost his speech. His words, too, are generated, by a large language model (LLM) fine-tuned on his writing.

A smartphone app sits between Pedro and his interlocutors, transcribing what they say. It then generates three possible responses for him, playing them through headphones one at a time. A monitor resembling a sweatband sits on his forehead, waiting for an eyebrow twitch that he uses to select a response. The eyebrow is one of the last muscles over which most MND patients lose conscious control.

That's not to say the system is perfect. A conversation crammed into multiple-choice options is a frustrating limitation; at times, Pedro grimaces at the limited responses available. When trying to explain how long he had lived in Lisbon, the most accurate answer correctly said where he had moved from, but claimed not to know when, to the mock horror of his wife.

The system, dubbed Halo, is a project at Unbabel, a tech company based in Lisbon. Pedro has learned to use eye-tracking software to control a computer, and for nuanced thoughts it is still his preferred method. But the hardware required is bulky, and every time he moves it must be recalibrated from scratch. In contrast, the Halo band can be worn on the go, in the car or even in the bath, giving him speech where he had none. And the system is modular: when using eye-tracking to type, he says, he prefers to use the Halo voice, his own voice, to speak the answers.
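In outline, the exchange follows a simple select-from-candidates loop: transcribe what the interlocutor says, have the language model propose a handful of replies, read each aloud in the cloned voice, and accept whichever one draws an eyebrow twitch. The Python sketch below is purely illustrative; Unbabel has not published Halo's code, and every function here is a hypothetical stand-in for the real speech-to-text, LLM, voice-clone and sensor components.

# A minimal, illustrative sketch of the select-from-candidates loop.
# Every function is a hypothetical stub, not Unbabel's actual Halo code.
import time
from typing import List, Optional

def transcribe_interlocutor() -> str:
    # Stand-in for a speech-to-text step (assumption, not Halo's API).
    return "How long did you live in Lisbon?"

def generate_candidate_replies(heard: str, n: int = 3) -> List[str]:
    # Stand-in for an LLM fine-tuned on the user's own writing (assumption).
    return [
        "About ten years, after moving from another city.",
        "Quite a while, though I forget exactly when I arrived.",
        "Long enough that it feels like home.",
    ][:n]

def play_through_headphones(text: str) -> None:
    # Stand-in for the cloned voice reading a candidate aloud (assumption).
    print(f"(voice clone) {text}")

def eyebrow_twitch_detected(timeout_s: float = 2.0) -> bool:
    # Stand-in for the forehead band's binary "select" signal (assumption).
    time.sleep(min(timeout_s, 0.1))  # a real sensor would block on hardware
    return False                     # replace with the band's actual trigger

def conversation_turn() -> Optional[str]:
    heard = transcribe_interlocutor()
    for reply in generate_candidate_replies(heard):
        play_through_headphones(reply)   # read one option at a time
        if eyebrow_twitch_detected():    # a twitch accepts this option
            return reply
    return None                          # nothing accepted; regenerate or fall back to typing

if __name__ == "__main__":
    chosen = conversation_turn()
    print("Spoken reply:", chosen if chosen else "(none selected)")

The multiple-choice design trades expressiveness for speed, which is exactly the frustration Pedro describes: a single binary signal can only accept or skip what the model offers.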