"Speak the speech, I pray you, as I pronounced it to you, trippingly on the tongue; but if you mouth it, as many of your players do, I had as lief the town-crier spoke my lines." How about tongue braille as a novel way to read Hamlet?
The tongue, asserts Paul Bach-y-Rita, is a terrific portal to the brain. The UW-Madison physician and inventor says the tongue might serve as the ideal tactile environment to help blind people navigate, give Navy SEALs directions in dim underwater environments, and guide urban search-and-rescue teams as they comb smoke-filled buildings for survivors.
"You don't see with your eyes, you see with your brain," says Bach-y-Rita, who, with colleague Kurt Kaczmarek, has applied for a patent on a device that uses electrical impulses to route spatial information through the tongue to the brain.
"The brain is very malleable," says Bach-y-Rita. "You can compensate for sensory loss by rehabilitating the brain" and turning to surviving sensory systems such as the skin and the tongue to substitute for lost vision.
Loaded with nerves and bathed in its own conductive saline solution, the tongue is an ideal surface for a tiny array of 144 electrodes that can, through the coordinated firing of mild electrical impulses, route images from a camera, computer or other device straight to the brain.
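The camera-to-tongue mapping described above can be imagined as a simple downsampling step: each electrode fires in proportion to the brightness of its patch of the image. A minimal sketch, assuming a 12 × 12 grid (144 electrodes, matching the array size given here) and a 0–100 percent pulse-amplitude scale; the function name and scaling are illustrative, not the actual device firmware:

```python
# Hypothetical sketch: map a grayscale camera frame onto a 12 x 12
# electrode grid (144 electrodes) by block-averaging pixel intensities.
# The 0-100 % amplitude scale is an illustrative assumption.

GRID = 12  # 12 x 12 = 144 electrodes, as in the array described above

def frame_to_electrodes(frame):
    """Downsample a grayscale frame (list of rows of 0-255 pixels,
    dimensions divisible by GRID) to per-electrode pulse amplitudes."""
    rows, cols = len(frame), len(frame[0])
    bh, bw = rows // GRID, cols // GRID  # pixel block per electrode
    amplitudes = []
    for gy in range(GRID):
        row = []
        for gx in range(GRID):
            block = [frame[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            mean = sum(block) / len(block)
            row.append(round(mean / 255 * 100))  # % of max pulse amplitude
        amplitudes.append(row)
    return amplitudes

# A bright square on a dark background becomes strong pulses in the
# corresponding central region of the tongue array.
frame = [[255 if 24 <= x < 48 and 24 <= y < 48 else 0
          for x in range(96)] for y in range(96)]
grid = frame_to_electrodes(frame)
```

The point of the sketch is that no exotic processing is needed: spatial structure in the image survives the downsampling, and the brain learns to interpret the resulting pattern of pulses.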
New miniaturized electronics, say Bach-y-Rita and Kaczmarek, will permit the device to be as small as a dental retainer, or smaller, and enable it to be built directly into the respirators used by divers and firefighters.
A related, tongue-based application is being developed by UW-Madison researcher Mitch Tyler to help people who have lost their sense of balance.
The technology has even caught the attention of some in the video gaming industry who see it as a bold new frontier for controlling the action of electronic gaming.
In addition to systems for the blind, Bach-y-Rita says the technology could have other applications, because designers can create impulses from any measurable source.
He is in discussions with the military regarding devices to allow divers to "see" more effectively through murky water using their mouthpieces, or to allow soldiers to receive night-vision readouts through their tongues. He adds that the tongue sensors could one day be used in conjunction with video games, and his team has received a federal grant for a system that will aid people who've lost their sense of balance.
Researchers at the Naval Aerospace Medical Research Laboratory and the Institute for Human and Machine Cognition have used Bach-y-Rita’s ideas to give pilots expanded spatial awareness akin to sight. Instead of electrodes on the tongue, the Tactile Situation Awareness System uses a flight suit embedded with as many as 96 transducers – miniature vibrators like the ones found in cell phones. The TSAS makes pilots less dependent on their eyes. "The visual workload has gone up so high that we’re seeing an increase in the number of human factor-related mishaps," says Anil Raj, who heads the program at the University of West Florida. Now pilots can gauge their orientation from a buzz on the torso. If the plane banks left, they feel a zap on the left. If the plane makes a 180-degree turn, the zap travels from one side of the body to the other. It usually takes months of training before pilots can look at their altimeters, attitude indicators, and compasses and understand a plane’s orientation in space. With TSAS, it takes 10 minutes.
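The bank-to-buzz mapping described here can be sketched in a few lines. This is a simplified illustration assuming a ring of 8 torso transducers rather than the suit's full complement of up to 96, and the indexing convention and function name are hypothetical, not the actual TSAS layout:

```python
# Hypothetical sketch of the TSAS idea: map aircraft bank (roll) angle
# to one transducer in a horizontal ring around the torso.
# Assumptions: 8 transducers (the real suit uses up to 96),
# index 0 = front, indices increase clockwise, negative roll = bank left.

N_TRANSDUCERS = 8

def active_transducer(roll_deg):
    """Return the index of the transducer to fire for a given bank angle."""
    sector = 360 / N_TRANSDUCERS        # degrees covered per transducer
    # Wrap the angle into [0, 360), then pick the nearest sector.
    return int(((roll_deg % 360) + sector / 2) // sector) % N_TRANSDUCERS
```

With this convention, a left bank (negative roll) fires a transducer on the pilot's left side and a right bank fires one on the right, so orientation is felt rather than read off an instrument; as the bank angle sweeps through a turn, the active transducer walks around the ring, matching the traveling zap the article describes.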
Randall Parker, 2002 November 30 10:05 AM, Cyborg Tech