Wednesday, January 19, 2011

CS Senior Project Proposal: Earfingers

Within this milieu of stupendously sophisticated technology and virtual reality, humanity has indubitably extended the boundary of what is possible beyond the limits set by all but the most exceptional imaginations. The aim of this project is to tap this overabundance in order to produce a novel tactile human/computer interface with the potential to enable an unprecedented interaction with information, and a chance at evoking a new experience of existence through the manipulation of sensation mechanisms; that is, to possibly feel something that has never been felt before.

There is no question regarding the significance and success of the human brain, exemplified by artifacts such as smartphones that are capable of feats essentially indistinguishable from magic. It is remarkable that we are able to interpret and utilize information so effectively when the process depends entirely on just five low-resolution information pathways, better known as senses. Recognizing the inherent limitations of these senses and manipulating our environment in an effort to circumvent them has continually and directly expanded our ability to understand existence. For instance, the development of the optical microscope immediately led to the profound revelation that all life is composed of legions of fundamentally similar cells. Despite this knowledge having been in the public domain for over three hundred years, it remains practically inconceivable: what we think we know better than anything else, of ourselves and of others, rests on something only vaguely real, since each of us is an emergent property of a cloud of many trillions of individual cells rather than the one continuous thing we imagine.

That deliberate enhancement of sensation was necessary, and that we are able to operate oblivious to our true nature as an Astropolis of individuals, together indicate that our behavior is at least informed by sensation. Had the ability to distinguish and accurately record individual cellular behavior been a sixth member of our sensory repertoire, biology would surely be extraordinarily more advanced than at present, and both the way we experienced existence and the way we interacted as a society would be fundamentally different. For another example, imagine that we could sense blood flow in another brain (as in functional magnetic resonance imaging): instead of having once learned the abstract fact that a localized hemodynamic response can indicate something relatively specific, like an ongoing hunger/hunting drive, we could use this sense and our proven intellect to determine that what is apparently a log at the watering hole according to five senses is actually a hungry crocodile according to the sixth. Clearly, then, sensation doesn't just inform behavior; in many important ways it defines boundaries for behavior by serving to enumerate the possible outcomes.

Unfortunately, my project is not to create a portable, affordable fMRI; we have gone this far afield only to illustrate just how significant sensory experience is. That we have but five information pathways is a fact not readily malleable (until we start mechanically re-engineering the nervous system), and this is a major disappointment in light of the significance of sensation and the potential of supernatural sensation to redefine our experience. The objective of this project is essentially to experiment with several soft hacks on the nervous system (no rewiring necessary) based on what is known about how it operates. Instead of trying to stimulate novel neurological information and make it perceptibly informative, I will feed known neurological information down pathways intended for other information, in the hope that the signal will nonetheless be at least partially recognized. In particular, I will attempt to induce the perception of sound through the tactile and visual senses.

The majority of sound is perceived through a single information pathway into the cerebrum; this in itself is not surprising, but it doesn't necessarily mean the information can't enter through another pathway. It is a well-known result of neuroplasticity that a region of the brain not receiving its intended sensory input may be recruited to process other senses, a phenomenon observed in the utilization of the occipital lobe in the congenitally blind. There is also evidence that tactile sensation of vibration excites activity in the auditory cortex of the normal human encephalon. Another neurological relationship between auditory and tactile information lies in the functional limitation of operational frequency: neurons can only convey action potentials (assumed to operate ultimately as a decision problem) at roughly 20-1000 Hz, due to the time it takes to pump ions back across the cell membrane and recharge. The optimum response frequency of the Pacinian and Meissner mechanoreceptors, where the action potential is in sync with the stimulation source, corresponds with the frequency range at which the large outer surface of the cochlea responds in synchrony, both from about 50 to 200 Hz. The cochlea responds to stimuli up to around 20 kHz by exploiting the graded mechanical resonance of its coiled structure, with lower frequencies traveling deeper toward the apex, discretizing the sound wave into what is called a tonotopic mapping. Since the cochlea has about 3600 receptors, the signal sent to the brain is more or less a frequency decomposition with roughly 6 ms frames into 3600 dimensions, reduced to an indicator function if amplitude is ignored. Theoretically, then, reproducing the neurological signal precisely would require as many tactors. The precise spatial distribution of mechanoreceptors remains to be pinned down, but it is generally figured that there are far more on the fingertips than on the torso.
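To give a concrete sense of what such a frame-based decomposition looks like computationally, here is a minimal sketch that cuts an audio signal into roughly 6 ms frames and reduces each frame to a handful of log-spaced band energies; the function name, channel count, and band edges are placeholders chosen for illustration, not a settled design.

    # Rough sketch of a frame-based frequency decomposition in the spirit of
    # the tonotopic mapping described above. Assumes numpy and a mono float
    # signal; the channel count and band edges are illustrative placeholders.
    import numpy as np

    def band_frames(samples, sample_rate=44100, frame_ms=6, n_channels=10):
        """Cut `samples` into ~frame_ms frames and return per-frame band energies."""
        frame_len = int(sample_rate * frame_ms / 1000)   # ~264 samples at 44.1 kHz
        n_frames = len(samples) // frame_len
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sample_rate)
        # Log-spaced band edges from 20 Hz to 20 kHz, loosely mimicking tonotopy.
        edges = np.logspace(np.log10(20), np.log10(20000), n_channels + 1)
        bin_to_band = np.digitize(freqs, edges) - 1      # bins outside 20 Hz-20 kHz map outside 0..n_channels-1
        window = np.hanning(frame_len)
        energies = np.zeros((n_frames, n_channels))
        for i in range(n_frames):
            frame = samples[i * frame_len:(i + 1) * frame_len]
            spectrum = np.abs(np.fft.rfft(frame * window))
            for c in range(n_channels):
                energies[i, c] = spectrum[bin_to_band == c].sum()
        return energies  # thresholding this yields the indicator-function version

Note that a 6 ms frame at 44.1 kHz yields only on the order of a hundred usable frequency bins, so approaching the cochlea's roughly 3600 channels would require longer or overlapping frames, or a proper filter bank, rather than a single short FFT.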

For my project I intend to create the experimental hardware and software needed to reproduce sound information through the tactile and visual pathways. The tactile information will be generated by moving-magnet voice coil tactors (a portmanteau of tactile and motor), driven by an Arduino receiving FFT results from a PC. I intend to have a minimum of 10 tactors, a lower bound that is more practically relevant than the theoretical upper bound of 3600 per hand. Scaling up the number of tactors will eventually necessitate efficiently encoding the addresses and data over the serial bus so the Arduino can stay in sync, though this concern can be avoided at first, since the 16 MHz clock of the ATmega328 can comfortably signal quite a few tactors at 200 Hz. Then there are the sundry, delectable mechanical quandaries, which will need to be overcome one way or another. For visual stimulation, I will use common, powerful display technology and seek to create a program that presents a neurologically meaningful visual representation of sound, trying to account for the idiosyncrasies of the visual system such as the relatively small area of high resolution on the retina, saccades, and the distribution of rods and cones.

Whether this concert of stimulation will have any effect remains to be seen, but even a null result in those born with normal audition still leaves the possibility of eliciting the perception of sound in the congenitally deaf. Even just the chance of enabling someone to experience music for the first time, of sharing the uniquely human celebration that is music, is justification enough for my efforts. Because this project is experimental in nature, and, excitingly, the results unknown, I think the single best indication of success will be that the people who evaluate what I have done are impressed by it.
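To make the serial link between the PC and the Arduino concrete, here is one possible way the PC side might frame each tactor update; the sync byte, byte layout, baud rate, and device path are all placeholders for illustration, not a settled protocol.

    # Hypothetical framing for the PC-to-Arduino tactor link: one sync byte
    # followed by one amplitude byte per tactor, so the receiver can
    # resynchronize if a byte is dropped. Assumes pyserial; the device path
    # and baud rate are placeholders.
    import serial

    SYNC = 0xFF       # reserved marker byte; amplitudes are clamped below it
    N_TACTORS = 10

    def send_frame(port, amplitudes):
        """Send one frame of tactor amplitudes (floats in 0.0-1.0) over `port`."""
        payload = bytes(min(int(a * 254), 254) for a in amplitudes[:N_TACTORS])
        port.write(bytes([SYNC]) + payload)

    if __name__ == "__main__":
        link = serial.Serial("/dev/ttyUSB0", baudrate=115200)  # placeholder path
        send_frame(link, [0.5] * N_TACTORS)                    # one 200 Hz update

At 200 updates per second this scheme moves only 11 bytes per update, roughly 22 kbit/s on the wire, which fits comfortably within an ordinary 115200-baud link and is consistent with the point above that the serial encoding only becomes a concern once the tactor count grows considerably.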

1 comment:

Tom said...

I really like this idea, far more interesting than stuff like the BrainPort.