DEMONSTRATION OF CLOSED-LOOP STIMULATION PLATFORM WITH HARDWARE ACCELERATED ARTIFACT REMOVAL

The lab has developed and validated a closed-loop stimulation platform capable of updating the stimulation montage at intervals as brief as 10 milliseconds, with a latency of only 20 milliseconds. However, the frequent montage updates during biomimetic stimulation generate large, varying electrical artifacts that obscure the neural response. Our research team has designed a proprietary integrated chip that leverages the spatial correlation inherent in the artifacts to perform hardware-accelerated artifact removal, enabling uninterrupted, real-time monitoring of neural responses during active stimulation.
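The chip's implementation is proprietary, but the core idea of exploiting spatial correlation can be illustrated in software: regress each recording channel against artifact-dominated reference channels and subtract the fitted artifact. The sketch below is only an illustration of that idea with made-up signal parameters, not the chip's actual algorithm.

```python
import numpy as np

def remove_artifact_spatially(raw, artifact_refs):
    """Illustrative spatial-correlation artifact removal (not the chip's algorithm).

    raw           : (n_samples, n_channels) recorded data (neural + artifact)
    artifact_refs : (n_samples, n_refs) channels dominated by the artifact
    """
    # Least-squares fit of how much of each reference appears on each channel.
    weights, *_ = np.linalg.lstsq(artifact_refs, raw, rcond=None)
    # Subtract the spatially correlated artifact estimate.
    return raw - artifact_refs @ weights

if __name__ == "__main__":
    t = np.arange(0.0, 1.0, 1e-4)                          # 10 kHz, 1 second
    neural = 5e-6 * np.sin(2 * np.pi * 12 * t)             # emulated 12 Hz response
    artifact = 1e-3 * np.sign(np.sin(2 * np.pi * 30 * t))  # 30 Hz stimulation artifact
    refs = np.column_stack([artifact, np.roll(artifact, 3)])
    raw = (neural + 0.8 * artifact)[:, None]               # one recording channel
    clean = remove_artifact_spatially(raw, refs)           # ~recovers the 12 Hz signal
```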

In the demo, a lab member validates the recording system's ability to capture emulated neural signals during concurrent biomimetic (dynamic) stimulation. The upper trace displayed on the screen represents the raw input data, which contains the stimulation artifacts together with an oscillatory signal representing the underlying neural response. The bottom trace shows the neural response extracted by the proprietary recording system.

Demonstration of Closed-loop Operation with Our Implantable Device


The blue and pink traces on the oscilloscope represent two of the many channels generating stimulation pulses, initially at 30 Hz and 60 Hz, respectively, with different phase delays. The green trace represents the ADC recording, which is wirelessly transmitted in real time to a laptop running the closed-loop controller. When the device's analog recording sensor is toggled, the closed-loop controller on the laptop detects the change, modulates the stimulation protocols, and wirelessly communicates the new protocols back to the SoC. The last portion of the video shows the new stimulation patterns on the blue and pink traces, now at 30 Hz and 20 Hz with updated phase delays, respectively. In summary, we have demonstrated our system's capability to perform real-time recording and stimulation simultaneously, showing that closed-loop operation with our implantable system is feasible. A more sophisticated controller is under development.
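As a rough illustration of the laptop-side logic (not the actual controller code or radio API), the sketch below streams ADC samples, detects when the recording sensor has been toggled, and pushes an updated stimulation protocol back to the SoC. The I/O callables, the threshold, and the phase values are placeholders.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass(frozen=True)
class StimProtocol:
    freq_hz: Dict[str, int]     # stimulation frequency per channel
    phase_deg: Dict[str, int]   # phase delay per channel (placeholder values)

# Protocols from the demo: 30/60 Hz initially, 30/20 Hz after the toggle.
PROTOCOL_A = StimProtocol({"blue": 30, "pink": 60}, {"blue": 0, "pink": 90})
PROTOCOL_B = StimProtocol({"blue": 30, "pink": 20}, {"blue": 0, "pink": 45})

def run_controller(read_adc_packet: Callable[[], Optional[float]],
                   send_stim_update: Callable[[StimProtocol], None],
                   threshold: float = 0.5) -> None:
    """Toy closed-loop controller: when the streamed ADC level crosses
    `threshold` (i.e., the recording sensor was toggled), switch between the
    two stimulation protocols and transmit the change back to the SoC."""
    active = PROTOCOL_A
    send_stim_update(active)
    while True:
        sample = read_adc_packet()          # blocking read from the wireless link
        if sample is None:                  # link closed / demo finished
            break
        desired = PROTOCOL_B if sample > threshold else PROTOCOL_A
        if desired is not active:           # transmit only when the protocol changes
            send_stim_update(desired)
            active = desired
```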

Click to view the Demo

The figure shows our closed-loop system performing real-time, multi-channel recording and stimulation. The green and yellow traces are EMG signals fed into the recording sensors of the chip. This information is transmitted wirelessly as input to a control algorithm that detects EMG envelopes across the chip's multiple recording channels. The decision algorithm sets the chip's stimulation parameters to 60 Hz (blue trace) and 20 Hz (pink trace) when a large envelope is detected in the yellow EMG signal, and to 20 Hz (blue trace) and 30 Hz (pink trace) when a large envelope is detected in the green EMG signal.
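The decision rule can be sketched as follows; the moving-RMS envelope, window length, and threshold are illustrative choices, not the deployed algorithm.

```python
import numpy as np

def emg_envelope(x: np.ndarray, fs: float, win_ms: float = 100.0) -> np.ndarray:
    """Moving-RMS envelope of an EMG trace."""
    n = max(1, int(fs * win_ms / 1000.0))
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(x ** 2, kernel, mode="same"))

def decide_stim_rates(green_emg: np.ndarray, yellow_emg: np.ndarray,
                      fs: float, threshold: float):
    """Return (blue_hz, pink_hz) according to which EMG channel shows a
    large envelope, or None if neither channel is active."""
    if emg_envelope(yellow_emg, fs).max() > threshold:
        return 60, 20    # large envelope on the yellow EMG channel
    if emg_envelope(green_emg, fs).max() > threshold:
        return 20, 30    # large envelope on the green EMG channel
    return None
```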


Successful Validation of Bi-directional Communication for Our Implant SoC

Link to Demo Video

We have successfully tested and validated the bidirectional communication links (forward and reverse) of our implant SoC and system, designed to restore motor function after spinal cord injury (SCI). Stimulation commands are issued via the forward link (cyan trace), while the responses to those commands are digitally transmitted and received by the customized graphical user interface (GUI) via the reverse link (green trace). A command can simultaneously target multiple spinal cord segments, and thus the corresponding lower-limb muscles, with different sets of waveform parameters, including frequency, skew, duration, amplitude, and phase (yellow and pink traces). This is a critical step toward realizing a closed-loop system for SCI applications.
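For illustration only, a forward-link command can be viewed as a packet carrying one waveform-parameter set per targeted segment. The field names, units, and example values below are assumptions and do not reflect the SoC's actual command format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WaveformParams:
    segment: str            # targeted spinal cord segment (e.g., "L2")
    frequency_hz: float     # pulse train frequency
    amplitude_ma: float     # stimulation amplitude
    duration_us: float      # pulse duration
    skew: float             # asymmetry of the biphasic pulse (0 = symmetric)
    phase_deg: float        # phase delay relative to a common reference

@dataclass
class ForwardCommand:
    """One forward-link command carrying parameter sets for several spinal
    cord segments at once, mirroring the simultaneous targeting above."""
    channels: List[WaveformParams]

# Example command driving two segments with different waveforms.
cmd = ForwardCommand(channels=[
    WaveformParams("L2", frequency_hz=30, amplitude_ma=2.0,
                   duration_us=200, skew=0.0, phase_deg=0),
    WaveformParams("S1", frequency_hz=60, amplitude_ma=1.5,
                   duration_us=200, skew=0.0, phase_deg=90),
])
```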

NSF news: Artificial Retina Receives FDA Approval

February 14, 2013

From: https://www.nsf.gov/news/news_summ.jsp?cntn_id=126756

The U.S. Food and Drug Administration (FDA) granted market approval to an artificial retina technology today, the first bionic eye to be approved for patients in the United States. The prosthetic technology was developed in part with support from the National Science Foundation (NSF).

The device, called the Argus® II Retinal Prosthesis System, transmits images from a small, eye-glass-mounted camera wirelessly to a microelectrode array implanted on a patient’s damaged retina. The array sends electrical signals via the optic nerve, and the brain interprets a visual image.

The FDA approval currently applies to individuals who have lost sight as a result of severe to profound retinitis pigmentosa (RP), an ailment that affects one in every 4,000 Americans. The implant allows some individuals with RP, who are completely blind, to locate objects, detect movement, improve orientation and mobility skills and discern shapes such as large letters.

The Argus II is manufactured by, and will be distributed by, Second Sight Medical Products of Sylmar, Calif., which is part of the team of scientists and engineers from the university, federal and private sectors who spent nearly two decades developing the system with public and private investment.

“Seeing my grandmother go blind motivated me to pursue ophthalmology and biomedical engineering to develop a treatment for patients for whom there was no foreseeable cure,” says the technology’s co-developer, Mark Humayun, associate director of research at the Doheny Eye Institute at the University of Southern California and director of the NSF Engineering Research Center for Biomimetic MicroElectronic Systems (BMES). “It was an interdisciplinary approach grounded in biomedical engineering that has allowed us to develop the Argus II, making it the first commercially approved retinal implant in the world to restore sight to some blind patients,” Humayun adds.

The effort by Humayun and his colleagues has received early and continuing support from NSF, the National Institutes of Health and the Department of Energy, with grants totaling more than $100 million. The private sector’s support nearly matched that of the federal government.

“The retinal implant exemplifies how NSF grants for high-risk, fundamental research can directly result in ground-breaking technologies decades later,” said Acting NSF Assistant Director for Engineering Kesh Narayanan. “In collaboration with the Second Sight team and the courageous patients who volunteered to have experimental surgery to implant the first-generation devices, the researchers of NSF’s Biomimetic MicroElectronic Systems Engineering Research Center are developing technologies that may ultimately have as profound an impact on blindness as the cochlear implant has had for hearing loss.”

Although some treatments to slow the progression of degenerative diseases of the retina are available, no treatment has existed that could replace the function of lost photoreceptors in the eye.

The researchers began their retinal prosthesis research in the late 1980s to address that need, and in 1994 Humayun received his first NSF grant, an NSF Young Investigator Award, which built upon additional support from the Whittaker Foundation. Humayun used the funding to develop the first conceptualization of the Argus II’s underlying artificial retina technology.

Since that time, he and his collaborators–including Wentai Liu of the University of California, Los Angeles and fellow USC researchers Jim Weiland and Eugene de Juan, Jr.–received six additional NSF grants, totaling $40 million, some of which was part of NSF’s funding for BMES, launched in 2003. BMES drives research into a range of sophisticated prosthetic technologies to treat blindness, paralysis and other conditions.

“We were encouraged by the team’s exploratory work in the 1980s and 1990s, supported by NSF and others, which revealed that healthy neural pathways can carry information to the brain, even though other parts of the eye are damaged,” adds Narayanan. “The retinal prosthesis they developed from that work simulates the most complex part of the eye. Based on the promise of that implant, we decided in 2003 to entrust the research team with an NSF Engineering Research Center,” says Narayanan. “The center was to scale up technology development and increase device sensitivity and biocompatibility, while simultaneously preparing students for the workforce and building partnerships to speed the technology to the marketplace, where it could make a difference in people’s lives. The center has succeeded with all of those goals.”

The researchers’ efforts have bridged cellular biology–necessary for understanding how to stimulate the retinal ganglion cells without permanent damage–with microelectronics, which led to the miniaturized, low-power integrated chip for performing signal conversion, conditioning and stimulation functions. The hardware was paired with software processing and tuning algorithms that convert visual imagery to stimulation signals, and the entire system had to be incorporated within hermetically sealed packaging that allowed the electronics to operate in the vitreous fluid of the eye indefinitely. Finally, the research team had to develop new surgical techniques in order to integrate the device with the body, ensuring accurate placement of the stimulation electrodes on the retina.

“The artificial retina is a great engineering challenge under the interdisciplinary constraint of biology, enabling technology, regulatory compliance, as well as sophisticated design science,” adds Liu. “The artificial retina provides an interface between biotic and abiotic systems. Its unique design characteristics rely on system-level optimization, rather than the more common practice of component optimization, to achieve miniaturization and integration. Using the most advanced semiconductor technology, the engine for the artificial retina is a ‘system on a chip’ of mixed voltages and mixed analog-digital design, which provides self-contained power and data management and other functionality. This design for the artificial retina facilitates both surgical procedures and regulatory compliance.”

The Argus II design consists of an external video camera system matched to the implanted retinal stimulator, which contains a microelectrode array that spans 20 degrees of visual field. The NSF BMES ERC has developed a prototype system with an array of more than 15 times as many electrodes and an ultra-miniature video camera that can be implanted in the eye. However, this prototype is many years away from being available for patient use.

“The external camera system, built into a pair of glasses, streams video to a belt-worn computer, which converts the video into stimulus commands for the implant,” says Weiland. “The belt-worn computer encodes the commands into a wireless signal that is transmitted to the implant, which has the necessary electronics to receive and decode both wireless power and data. Based on those data, the implant stimulates the retina with small electrical pulses. The electronics are hermetically packaged and the electrical stimulus is delivered to the retina via a microelectrode array.”

In 1998, Robert Greenberg founded Second Sight to develop the technology for the marketplace. While under development, the Argus I and Argus II systems have won wide recognition, including a 2010 Popular Mechanics Breakthrough Award and a 2009 R&D 100 Award, but it is only with FDA approval that the technology can now be made available to patients.

“An artificial retina can offer hope to those with retinitis pigmentosa, as it may help them achieve a level of visual perception that enhances their quality of life, enabling them to perform functions of daily living more easily and the chance to enjoy simple pleasures we may take for granted,” says Narayanan. “Such success is the result of fundamental studies in several fields, technology improvements based on those results and feedback from clinical trials–all enabled by sustained public and private investment from entities like NSF.”

For more, please see an NSF Science Nation video on the Argus I technology, and read more about the early stages of development for both devices in this feature story.
https://www.nsf.gov/news/news_videos.jsp?org=NSF&cntn_id=126756&media_id=73830

-NSF-

Brain to Brain Communication

Our brain-to-brain communication research focuses on advancing the precision of non-invasive recording and stimulation of the cerebral cortex. This precision is increased through the development of advanced optimization algorithms combined with accurate tissue modeling. The models are augmented by our hardware stimulation and recording systems, which allow independent, fine-tuned stimulation of a large number of channels and fast, real-time data transmission between multiple brain-interface devices. Our bold aim is to step beyond the brain-computer interface and make direct human brain-to-brain connection possible, turning science fiction into reality.