Mandayam Srinivasan vividly recalls the first time he gained a window on the miracle that is our system of touch.
An engineer by training, Srinivasan was offered a postdoctoral position in 1984 to take part in nervous-system studies at Yale, where he’d just earned his PhD probing the complex mathematics of how the walls of structures such as oil-drilling rigs respond to varying pressures.
His new research involved exposing a monkey’s “fingers” to varying shapes and pressures, then measuring the resulting signals along the animal’s network of nerves. When the pressures were light, the signals — translated into sounds — went like this: “bup…bup…bup…bup…” When the pressures increased, on the other hand, the “bup, bup, bup’s” turned rapid-fire.
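The pattern in those recordings is what neuroscientists call a rate code: firmer pressure produces faster firing. A minimal sketch of that idea is below; the threshold and gain values are invented for illustration and are not from Srinivasan’s experiments.

```python
# Illustrative rate-code sketch: firing rate grows with applied pressure,
# as in the "bup...bup" recordings described above. The threshold, gain,
# and saturation values here are assumptions, not measured data.

def firing_rate_hz(pressure_kpa, threshold=0.5, gain=20.0, max_rate=200.0):
    """Map an applied pressure (kPa) to a spike rate in spikes per second."""
    if pressure_kpa <= threshold:
        return 0.0  # below threshold, the receptor stays silent
    return min(max_rate, gain * (pressure_kpa - threshold))

# Light touch fires slowly; firmer pressure turns the "bups" rapid-fire.
for p in (0.4, 1.0, 3.0, 8.0):
    print(f"{p:4.1f} kPa -> {firing_rate_hz(p):6.1f} spikes/s")
```

Real mechanoreceptors are far richer than this — they adapt over time and differ by receptor type — but the monotonic pressure-to-rate mapping is the core of what the recordings made audible.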
“It was exciting, almost indescribable, really, to hear these neuroreceptors fire, and be able to record them,” says Srinivasan. Focused up to then on a career in the field called theoretical mechanics, he soon changed course. “That year changed my life,” says the scientist.
After his postdoc, Srinivasan came to MIT’s Research Laboratory of Electronics, where he now heads its Laboratory for Human and Machine Haptics — or, as it’s nicknamed, “the Touch Lab.”
Those who study our system of touch believe their work holds great promise. Among potential payoffs: techniques that offer early warning of carpal tunnel syndrome, the debilitating hand-wrist malady that afflicts 300,000 in the U.S. alone; and “virtual reality” teaching tools that would let students get a direct sense of phenomena like the forces at work within atoms and molecules.
One of Srinivasan’s own goals is a system that would let future surgeons learn exactly how it feels to perform open-heart surgery, say, or manually probe a tumor mass — and do it without ever coming close to the real organs.
“What you would feel, and see on your computer screen or inside your virtual reality helmet, would be just like operating in real life,” he notes, “but you’d be dealing entirely with virtual environments.”
But there are barriers to touch-related breakthroughs. A big one is the complexity of touch itself.
When we pick up an egg, check the temperature of a baby’s bottle, or test a knife’s sharpness, we don’t give what we’ve accomplished a thought. In fact, actions like these are the product of an amazingly sophisticated system for sensing what we touch, and figuring out how to change our grasp in response to those signals.
“There are about 2,000 receptors in each of our finger tips whose only role is to gauge qualities like texture, shape, and friction,” says Srinivasan. “There may be even more sensors for gauging warmth or coolness, and for detecting pain due to mechanical, chemical or heat stimuli.”
Compare this system to the most advanced robotic “fingers,” which might harbor a few dozen receptors. No wonder it’s tough to build robots that can do a seemingly simple task like picking up a drinking glass.
But the touch system is not only complex, it’s almost eerily sensitive. Srinivasan has had volunteers test their skill at telling apart differing glass surfaces. Some of the surfaces are completely flat. Others have dots or patterns, invisible to the naked eye, etched in.
“If it’s a dot,” he notes, “it has to be about 3 microns (millionths of a meter) in depth for someone to feel it.” But if you etch a texture onto the surface, it only has to be 75 nanometers, or billionths of a meter, deep — that is, roughly one one-thousandth the diameter of a human hair.
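The hair comparison is easy to check with back-of-the-envelope arithmetic. Assuming a typical hair diameter of about 75 micrometers (human hair varies roughly between 50 and 100 micrometers; the 75 figure is an assumption, not from the text):

```python
# Unit check on the sensitivity figures quoted above.
hair_diameter_m = 75e-6   # assumed typical hair width, ~75 micrometers
texture_depth_m = 75e-9   # 75 nanometers, the etched-texture threshold
dot_depth_m = 3e-6        # 3 microns, the single-dot threshold

# The texture threshold as a fraction of a hair's width:
print(texture_depth_m / hair_diameter_m)   # about one one-thousandth

# A lone dot must be etched far deeper than a repeating texture:
print(dot_depth_m / texture_depth_m)       # 40 times deeper
```

The contrast between the two thresholds illustrates why repeating patterns are so much easier to feel than isolated features: the fingertip integrates many tiny cues as it sweeps across a texture.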
Despite the complexities of touch, the scientist is not only striving for a basic understanding of the touch system but is also exploring applications. The training system for surgeons is one. Srinivasan’s group has also built a sophisticated ultrasound microscope to probe exactly how skin layers in our fingers change shape in response to pressure — as happens in typing, for example — and is exploring the system’s use in cancer diagnosis.
“Cancer changes the mechanical properties of the skin,” notes Srinivasan, “and this system may be able to detect those changes.”
Another goal is a computer the visually impaired could use. Key to it is a large number — a thousand or more — of minuscule electromechanical devices built directly into a screen or keyboard.
“You could set it up so a blind person could ‘feel’ a menu, and could input choices just as we do by clicking a mouse key,” says Srinivasan. “You can even imagine a system which allows that person to carry out a two-way online conversation using touch.”
As enthusiastic as the scientist gets about such ideas, it’s clear the excitement he felt during his first exposures to the touch system still motivates him. “To be able to study phenomena in nature that have amazing effects, like touch, and to help make sense of the system — that’s a special experience,” he says.