For millennia the human experience has been governed by five senses, but advances in neuroscience and technology may soon give us a far broader perspective.
What counts as a sense in the first place is not clear-cut. Sight, hearing, taste, smell, and touch make up the traditional five, but our sense of balance and our ability to track the movement of our own body (proprioception) are also key sensory inputs. And while they are often lumped in with touch, our temperature and pain monitoring systems could arguably qualify as independent senses.
These senses are also not as concrete as we probably believe. Roughly 4.4% of the population experiences synesthesia — where the stimulation of one sense simultaneously produces sensations in another. This can result in people perceiving colors when they hear sounds or associating shapes with certain tastes, demonstrating the potential fluidity of our senses.
In recent years, scientists have taken advantage of this fluidity to develop workarounds for those who have lost one of their senses. The pioneering work of American neuroscientist Paul Bach-y-Rita in the 1960s demonstrated the plasticity of the human brain. He created a chair that translated a video feed into vibrations on 400 small touchpads pressed against a person’s back, which allowed congenitally blind patients to detect faces, objects, and shadows.
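The core of that mapping is easy to sketch: downsample each video frame to a 20 x 20 grid (400 pads) and drive each pad in proportion to the average brightness of its patch of the image. The function below is an illustrative reconstruction of that idea, not a description of Bach-y-Rita's actual electronics; the name and the brightness-to-drive assumption are mine.

```python
def frame_to_tactor_grid(frame, rows=20, cols=20):
    """Downsample a grayscale frame (a list of equal-length rows of
    0..255 pixel values) into a rows x cols grid of pad drive levels
    in 0..1, one level per vibrotactile pad."""
    h, w = len(frame), len(frame[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the block of pixels that maps onto this pad.
            r0, r1 = r * h // rows, (r + 1) * h // rows
            c0, c1 = c * w // cols, (c + 1) * w // cols
            block = [frame[i][j] for i in range(r0, r1)
                     for j in range(c0, c1)]
            row.append(sum(block) / len(block) / 255)
        grid.append(row)
    return grid
```

A bright object against a dark background thus becomes a cluster of strongly driven pads, which is the kind of pattern Bach-y-Rita's patients learned to read as shape.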
As Bach-y-Rita told Discover in 2003, “We don’t see with our eyes. We see with our brains.” Working on this principle, he and his lab developed a variety of sense-swapping techniques. Their work culminated in the late 1990s with the Tongue Display Unit, which plays tactile patterns on the tongue to help the blind see and restore a sense of balance.
Similar principles are now helping rewire neural pathways to do things like ‘hear’ visual scenes or ‘feel’ sounds. The vOICe smart glasses designed by Dutch engineer Peter Meijer convert pixels from a video feed into sound by mapping brightness and vertical location to pitch and volume.
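That pixel-to-sound principle can be sketched in a few lines. The code below is an illustrative reconstruction, not Meijer's actual implementation: each row of an image column gets its own pitch (higher rows, higher frequencies) and that row's brightness sets the tone's loudness; the function name, frequency range, and timing values are all assumptions.

```python
import math

def column_to_waveform(column, duration=0.064, sample_rate=16000,
                       f_min=500.0, f_max=5000.0):
    """Render one image column (brightness values 0..1, index 0 = top)
    as a short audio burst: row position -> pitch, brightness -> volume."""
    n_rows = len(column)
    n_samples = int(duration * sample_rate)
    # Log-spaced frequencies, highest for the top row (index 0).
    freqs = [f_max * (f_min / f_max) ** (r / max(n_rows - 1, 1))
             for r in range(n_rows)]
    wave = []
    for i in range(n_samples):
        t = i / sample_rate
        s = sum(b * math.sin(2 * math.pi * f * t)
                for b, f in zip(column, freqs))
        wave.append(s / n_rows)  # keep the mix within [-1, 1]
    return wave
```

Sweeping such columns left to right across a frame yields the characteristic one-second "soundscape" scan that vOICe users learn to decode.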
The Versatile Extra-Sensory Transducer (VEST), developed by David Eagleman at Baylor College of Medicine, is a vest that converts noises into vibrations that the user’s brain can learn to interpret as particular sounds. But while the VEST is aimed at helping the deaf, it is actually input agnostic; it has an open API, so it’s just as easy to input Twitter feeds, stock market data, or weather patterns.
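One common way to build such a sound-to-touch mapping is to split incoming audio into frequency bands and drive one motor per band. The sketch below illustrates that general idea only; it is not the VEST's actual firmware or API, and the band layout and normalization are assumptions.

```python
import math

def audio_to_motor_levels(samples, sample_rate, n_motors=8,
                          f_lo=100.0, f_hi=4000.0):
    """Map one audio frame onto n_motors vibration intensities (0..1):
    each motor tracks the signal energy near one log-spaced frequency."""
    centers = [f_lo * (f_hi / f_lo) ** (m / (n_motors - 1))
               for m in range(n_motors)]
    n = len(samples)
    levels = []
    for f in centers:
        # Single-frequency DFT: correlate the frame with cos/sin at f.
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        levels.append(math.hypot(re, im) * 2 / n)
    peak = max(levels) or 1.0
    return [lv / peak for lv in levels]  # scale to 0..1 for PWM duty
```

The input-agnostic point in the text falls out of this structure: anything that can be expressed as a list of numbers per frame, whether audio bands, stock ticks, or weather readings, can drive the same motor array.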
This opens up the tantalizing possibility of not only rerouting but actually augmenting our perceptual experience with inputs previously out of reach to humans. What started as a solution to medical problems is beginning to feed into the philosophy of transhumanism, which aims to use science and technology to help us evolve beyond our current physical and mental limitations.
Artist and “cyborg” Neil Harbisson is a living example of this trend. Born with a form of color blindness that means he sees the world in greyscale, he had an antenna with a camera implanted in his skull that converts colors into audible vibrations whose frequency is determined by the hue of the visual scene. He now reports a form of synesthesia: he can ‘hear’ paintings and also ascribes colors to particular sounds.
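Harbisson's device uses a "sonochromatic" scale of his own devising, but a generic hue-to-frequency mapping of the kind the text describes can be sketched as follows; the frequency range and logarithmic spacing here are illustrative assumptions, not his actual calibration.

```python
import colorsys

def hue_to_frequency(r, g, b, f_lo=200.0, f_hi=800.0):
    """Map an RGB color's hue (0..1, red at 0, cycling back to red)
    onto a tone frequency, log-spaced across [f_lo, f_hi] Hz."""
    h, _, _ = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return f_lo * (f_hi / f_lo) ** h
```

Under this mapping red produces the lowest tone and hues further around the color wheel produce progressively higher ones, so a painting scanned patch by patch becomes a melody.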
More importantly, his antenna allows him to perceive infrared and ultraviolet — frequencies beyond the normal human visual spectrum. He’s not the only one pushing the boundaries of human sensory experience. University of Reading professor Kevin Warwick had an electrode array interfaced directly with his nervous system to demonstrate controlling a robotic arm with his brain. In a subsequent experiment, he hooked the same implant up to ultrasonic sensors to hear in ultrasound.
The animal kingdom is full of inspiration for budding transhumanists looking for new ways to perceive the world. Many snakes can see infrared light, giving them a form of thermal vision; several species of fish can detect electrical fields; and both bird and insect species can tap into the Earth’s magnetic fields.
Experiments on rats appear to demonstrate that these perceptive abilities aren’t species-specific. By wiring detectors up to the brains of rodents, Duke University neuroscientists have allowed them to both ‘feel’ and ‘see’ in infrared. Another group from the University of Tokyo hooked up a geomagnetic compass to the visual cortex of blind rats, which allowed them to navigate a maze as well as sighted ones.
While it’s likely to be some time before scientists dare to try anything this invasive on humans, communities of ‘biohackers’ or ‘grinders’ are taking the lead of people like Harbisson and Warwick and carrying out some hair-raising body modifications on themselves in the name of practical transhumanism.
Collectives like Grindhouse Wetwares have experimented with everything from magnets implanted in the finger that buzz in different ways when close to electrical devices to range-finding sensors that allow one to build up a picture of a room’s contours with eyes closed via vibrations.
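The range-finding idea boils down to mapping distance onto the strength and tempo of vibration pulses. Here is a minimal sketch of that mapping, with illustrative thresholds rather than any actual device's tuning:

```python
def distance_to_pulse(distance_m, d_min=0.2, d_max=4.0):
    """Convert a rangefinder reading (meters) into haptic feedback:
    closer obstacles -> stronger, more rapid vibration pulses.
    Returns (intensity 0..1, seconds between pulses)."""
    # Clamp to the sensor's useful range, then normalize:
    # 0.0 = as close as it gets, 1.0 = edge of range.
    x = min(max(distance_m, d_min), d_max)
    norm = (x - d_min) / (d_max - d_min)
    intensity = 1.0 - norm            # motor duty cycle
    interval_s = 0.05 + 0.45 * norm   # gap between pulses
    return intensity, interval_s
```

Sweeping such a sensor around a room turns walls and furniture into a gradient of buzzes, which is how a wearer builds up the eyes-closed picture of the contours described above.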
This summer even saw the inaugural Hack the Senses hackathon take place in London, bringing together developers, designers, and neuroscientists to push the boundaries of sensory substitution and augmentation. Using the kind of gear found in any well-equipped makerspace, participants created clips that transmit navigation cues as vibrations through the hair, and a system that analyzes social media and vibrates when someone with similar interests passes by.
While these expeditions into extra-sensory augmentation are still in their formative stages, it’s becoming increasingly obvious that the scope of human perception is broader than many supposed. It may not be too long before some pretty astounding sensory powers become part of the everyday human experience.