DURHAM – When a person is brought into the emergency room suffering from a suspected stroke, they undergo a barrage of tests before a diagnosis is made. The most significant of these tests involves capturing images of the patient’s brain using either a computed tomography (CT) scan or a magnetic resonance imaging (MRI) machine.
While both of these tools can take accurate, high-resolution images, any movement from a patient can muddy the entire picture, making them less than ideal for use on children or people suffering from movement disorders. These devices are also extremely expensive, which puts them out of reach for hospitals and clinics with limited resources.
But the Beyond the Horizon project, “Real-Time 3D Functional Human Brain Imaging via Smart Epidermal Photoacoustic Tomography,” aims to change this paradigm. Xiaoyue Ni, an assistant professor of mechanical engineering and materials science, and Junjie Yao, an associate professor of biomedical engineering, will use their combined expertise in materials science and bioimaging to develop a low-profile, wearable imaging tool that can be attached directly to a patient’s scalp to rapidly capture accurate, detailed, 3D brain images.
“Wearable imaging devices have been developed before, but they are typically for places where there is a lot of soft tissue. Imaging through bones, like the skull, disrupts a lot of imaging tools, which is why we traditionally rely on MRI and CT scanners for brain imaging,” explains Ni.
Instead, the team’s approach relies on a technique known as photoacoustic tomography, in which a harmless laser pulse is shot into tissue. Molecules absorb the light, heat up slightly and send out an ultrasonic wave. These sound waves are then picked up by sensors and used to build detailed biomedical images of the targeted tissue. While the waves can still be disrupted by the skull, Yao and his collaborators have devised ways to measure and filter out the skull’s effects, enabling them to see into the brain.
“Photoacoustics is very sensitive to blood pumping through deep tissue, so it’s a great tool for tracking blood flow and oxygen levels in the brain,” says Yao. “And because it relies on the light absorption properties of tissue and molecules, we don’t need to use reagents, like in fluorescence imaging, or expose a patient to radiation, like we do with CT scans.”
While Yao and his lab adapt the photoacoustic technology for the project, Ni will develop the soft device platform housing the grid of sensors that detect the returning ultrasound waves.
“The key challenge with the platform is that once you use a soft array or something that is bendable and stretchable to conform to the head, you run the risk of losing track of the position of the transducers,” says Ni. “Everyone’s head has a different shape. We needed to find a way to ensure that we knew the location of all the sensors on the scalp all the time. If we don’t know where the sensors are in space, then we can’t get accurate images.”
To resolve this issue, Ni is developing a sensor that can be integrated into the wearable imaging device to track the shape of a patient’s head. If the team is successful, their tool would be an easy-to-use, non-invasive, more accessible and affordable alternative to the current medical standards.
“If a stroke happens once, it can easily happen again,” says Ni. “A tool like this would allow physicians to monitor a patient as they move freely and potentially make it easier to detect a second stroke before it happens.”
© Duke University