Did you see the recent news from Duke University researchers who taught rhesus monkeys to move a robotic arm using only signals from their brains? While I'm an animal lover and have to think hard about these sorts of experiments, I commend Duke and the monkeys for their research, which is a step forward in helping paralyzed people control neuroprosthetic limbs. In addition to the development of brain-controlled prosthetics, the data collected will help surgeons understand specific brain regions and the impact of surgery. (A friend of mine is having brain surgery next week, so I particularly appreciate the value of this data.)
Duke's monkeys were first trained to control the robotic arm using video signals and a joystick with a gripping lever sensorized with pressure transducers. Brain signals were captured and correlated with the movement, so the host system could learn the neural firing patterns associated with the movement.
Researchers then disconnected the joystick but let the monkeys continue to use it. Eventually, the monkeys realized the stick was no longer connected and that they could control the on-screen cursor, and thus the robotic arm, using only their brains and the visual feedback on the monitor.
Curious to learn more about the machine-brain interface and the challenges in designing the electronics for such a unique data-collection task, I tracked down Harvey Wiggins, founder of Plexon Inc., the manufacturer of the Multichannel Acquisition Processor (MAP) that is the heart of Duke's brain-signal acquisition. (Wiggins, by the way, is a long-time reader of Electronic Design.)
Wiggins explained that the brain's electrical signals are picked up via microwires inserted into the monkey's brain. The wires are made of stainless steel or tungsten, are between 30 and 50 µm in diameter, and are insulated with a very thin Teflon coating. They're attached to a "percutaneous" connector that's fastened through the animal's skull at the end of surgery: in this case, 32 wires per connector and 128 wires in total.
A "headset" that's a little larger than a postage stamp houses the first stage of electronics (low-current-noise op amps). It plugs into the connector on the monkey's head. "Because of the high source impedance of the electrodes, in the 1/4-MΩ to 1-MΩ range, one of the challenges of the system was finding small enough parts with the current noise characteristics that we needed," Wiggins explains. The headset's present design uses a National Semiconductor LMC6035 low-power CMOS op amp with a micro-ball-grid-array footprint. "In terms of a discrete part, the smallest we could find," he says.
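To see why that source impedance matters, a quick Johnson (thermal) noise estimate is instructive. The specific bandwidth and temperature below are my assumptions, not figures from the article; the impedance is the 1-MΩ upper end Wiggins cites.

```python
import math

# Rough thermal-noise floor of a high-impedance electrode.
# Assumed values (not from the article): body temperature, 10-kHz
# recording bandwidth (spikes sit in the low audio range).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # approx. body temperature, K
R = 1e6              # electrode source impedance, ohms (upper end cited)
B = 10e3             # assumed recording bandwidth, Hz

v_noise_rms = math.sqrt(4 * k_B * T * R * B)  # volts RMS
print(f"{v_noise_rms * 1e6:.1f} uV RMS")      # ~13.1 uV RMS
```

With extracellular spikes typically only tens to hundreds of microvolts, the electrode's own thermal noise is already a sizable fraction of the signal, which is why the front-end amplifier's noise budget is so tight.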
From the headset's unity-gain buffer, the signal goes to a preamplifier box that has a total gain of about 1000 and performs some filtering. "The spectral characteristics of nerve cells put them in the low audio range," explains Wiggins. From there, the signals go to the MAP, also known among neurobiologists as the "Harvey Box," Wiggins says.
The MAP box adds more gain and filtering, raising the total gain to 10,000 to 30,000. Each channel has its own analog-to-digital converter sampling at 40 kHz; the digitized signals pass to DSP boards, each carrying four Motorola 56002 DSP chips, with each chip processing eight channels. The resulting data is put into dual-port static RAM and from there "picked up and stuffed down a cable to the host," a PC equipped with a National Instruments high-speed bidirectional parallel digital I/O board.
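The raw throughput this chain must sustain is easy to sketch from the article's numbers. The 16-bit sample width below is my assumption; the article gives only the channel count and sampling rate.

```python
# Back-of-the-envelope throughput for the MAP acquisition chain.
channels = 128            # microwires, per the article
sample_rate_hz = 40_000   # per-channel ADC rate, per the article
bytes_per_sample = 2      # assumed 16-bit samples (not stated)

raw_rate = channels * sample_rate_hz * bytes_per_sample  # bytes/s
print(f"{raw_rate / 1e6:.2f} MB/s")  # 10.24 MB/s
```

Moving on the order of 10 MB/s of raw samples continuously is why the DSPs reduce the stream to sorted spike events before anything crosses the parallel link to the host PC.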
The waveform that comes out of a nerve cell when it fires is technically known as an "action potential." But colloquially, it's called a "spike," says Wiggins, "because that's what it looks like in an electrical system."
The challenges come from identifying and processing such a large number of spikes. The 128 electrode wires sit in the extracellular space between nerve cells, where each wire may pick up signals from one or more neurons as they fire. The signals from each nerve cell tend to have a unique shape, so the DSPs inside the MAP system use template matching to categorize the spikes according to the individual cells that produce them. Initially, the signals are sent to the host computer, which analyzes the waveforms, picks out the distinct shapes, and creates the templates. The templates are sent back down to the DSPs, which can then sort the waveforms in real time. This real-time sorting is essential to the machine-brain interface and to creating a closed-loop system to control the robotic arm.
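The template-matching step can be sketched in a few lines: assign each detected spike to whichever stored template it resembles most closely, here by minimum squared error. This is a hypothetical illustration; the article does not describe Plexon's actual matching metric, and all names and waveforms below are invented.

```python
import numpy as np

def sort_spike(waveform, templates):
    """Assign a spike snippet to the template with minimum squared error.

    waveform:  1-D array of samples around the spike peak.
    templates: dict mapping unit name -> template array of the same length.
    """
    errors = {unit: np.sum((waveform - tmpl) ** 2)
              for unit, tmpl in templates.items()}
    return min(errors, key=errors.get)

# Two toy templates, standing in for shapes the host extracted offline:
t = np.linspace(0.0, 1.0, 32)
templates = {
    "unit_a": np.sin(2 * np.pi * t),
    "unit_b": -np.sin(2 * np.pi * t),
}

# A noisy occurrence of unit_a's waveform is still sorted correctly:
rng = np.random.default_rng(0)
noisy = templates["unit_a"] + 0.1 * rng.standard_normal(32)
print(sort_spike(noisy, templates))  # unit_a
```

The expensive part, building the templates, runs once on the host; the per-spike comparison is cheap enough for the DSPs to run continuously.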
The biggest challenge was the very fast and often sporadic rate of spiking. "You can't predict the arrival of spike events or the rate. They can come sparsely or there may be bursts of them, so the solution was to design a multilevel queuing mechanism in DSP code to handle the continuous A/D sample stream, detect spikes, and run the spike sorting code," says Wiggins.
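The queuing idea Wiggins describes can be sketched as a two-stage pipeline: a fast threshold detector scans the continuous sample stream and hands spike snippets to a queue, decoupling detection from the slower sorting stage. The threshold, window size, and data below are invented for illustration, not Plexon's actual parameters.

```python
from collections import deque

THRESHOLD = 3.0  # assumed detection threshold (arbitrary units)
WINDOW = 8       # assumed snippet length, in samples

def detect_spikes(samples):
    """Scan the stream; queue a fixed-length snippet at each threshold crossing."""
    spike_queue = deque()
    i = 0
    while i <= len(samples) - WINDOW:
        if abs(samples[i]) > THRESHOLD:
            spike_queue.append(samples[i:i + WINDOW])  # hand off to sorter
            i += WINDOW  # skip past this event
        else:
            i += 1
    return spike_queue

# One spike buried in a quiet stretch of samples:
stream = [0.1, 0.2, 4.5, 4.0, 3.1, 1.0, 0.3, 0.2, 0.1, 0.0, 0.1, 0.2]
queue = detect_spikes(stream)
print(len(queue))  # 1
```

Because events arrive sparsely or in bursts, the queue absorbs the variation: the detector never stalls on the sample stream, and the sorter drains spikes at its own pace.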
The Duke study is sponsored by the Defense Advanced Research Projects Agency's Brain-Machine Interfaces Program. Beyond medical uses, "neurorobots" could enable remote reconnaissance in hazardous or inaccessible environments.