BioPhotonics spoke with René Heine, cofounder and CEO of Cubert, which specializes in real-time spectral imaging. The company has worked with researchers on a novel diagnostic approach that combines snapshot hyperspectral imaging with artificial intelligence (AI) to detect spectral signatures associated with Alzheimer’s disease pathology in the retina. Some current methods for detecting biomarkers of the disease remain costly and invasive.

Other researchers have noted that thinning of the retina precedes the familiar symptoms of Alzheimer’s disease. What are the limitations of current technologies in detecting this physiological change, as well as other biomarkers of the disease?

Unlike other methods, such as PET scans or cerebrospinal fluid markers, retinal examination, into which our technology can be integrated, is a normal part of many patients’ medical care. As people get older, they go to the eye doctor more regularly. The retina is a window into the brain. A recent prestudy by our partner RetiSpec showed that our hyperspectral technology achieved over 90% specificity in detecting biochemical changes. That result forms the basis for the full clinical trial in which we will seek FDA approval for the technology and, eventually, make it available to a broad audience. A number of companies are also looking to this technology to provide data for monitoring how effectively medications for Alzheimer’s have been dispersed.

Given that hyperspectral imaging generates such large amounts of data, how does AI help to extract diagnostic information?

There are between 120 and 130 layers of data in a hyperspectral image, with each pixel capturing the full spectral response of the retina. The goal of snapshot hyperspectral imaging is to measure the complete 3D datacube at the same time. This is especially important in eye exams, where the complete measurement takes no more than a few milliseconds.
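For readers unfamiliar with the datacube format, a minimal sketch of the data structure described here, with the roughly 120 to 130 spectral layers the interview cites. All dimensions and values are illustrative placeholders, not properties of Cubert's actual system:

```python
import numpy as np

# Hypothetical snapshot hyperspectral frame: ~125 spectral layers per image,
# captured in a single exposure. Spatial size here is purely illustrative.
H, W, BANDS = 64, 64, 125
cube = np.random.rand(H, W, BANDS)   # stand-in for one snapshot 3D datacube

# Every spatial pixel carries a full spectrum of the retina at that point.
spectrum = cube[32, 32, :]           # spectral response of one pixel
assert spectrum.shape == (BANDS,)

# Flatten to (n_pixels, n_bands): the per-pixel spectra that downstream
# analysis, such as a neural network, would consume.
pixel_spectra = cube.reshape(-1, BANDS)
print(pixel_spectra.shape)           # (4096, 125)
```

The key property of a snapshot system is that the whole cube corresponds to one instant, so no scanning artifacts arise even during the few-millisecond window of an eye exam.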
The information from the generated 3D datacube is then fed into a convolutional neural network (CNN), which can learn in multidimensional space. This way, molecular-level changes can be tracked across the full range of spectral data. We’ve been working with computer scientists, and the algorithm has changed multiple times. The challenge is to find a reliable spectral output that is specific enough to what we’re looking for. But our role is not to produce a diagnosis; we provide the full range of spectral information, such as oxygenation and other factors, that clinicians can use to their benefit.

What types of collaborations are advancing this technology toward future commercialization?

We have been working with RetiSpec for seven years, and together we have already deployed more than a dozen of our cameras. RetiSpec has been looking to commercialize the AI-based eye test for predicting amyloid, a key protein involved in the onset of Alzheimer’s disease. Companies such as Eli Lilly have been investing in this technology. A variety of companies and nonprofits have also been involved in the Bio-Hermes-001 and Bio-Hermes-002 studies: the former compared blood and digital biomarkers to amyloid PET scans, and the latter will gather data on each biomarker. Ultimately, the goal is for this technology to be integrated into a turnkey ophthalmoscope that general physicians can use for diagnostic testing.
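To make the earlier CNN discussion concrete, below is a toy sketch of how one pixel's spectrum could flow through a 1D convolutional layer, pooling, and a classification head. This is not RetiSpec's or Cubert's model; the filters are untrained random weights, and all sizes and the two-class output are assumptions chosen purely to show the shapes involved:

```python
import numpy as np

rng = np.random.default_rng(0)
BANDS = 125                                 # ~120-130 spectral layers per image

def conv1d(x, kernels):
    """Valid-mode 1D convolution of one spectrum with a bank of kernels."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)  # (len-k+1, k)
    return windows @ kernels.T                                # (len-k+1, n_kernels)

# Untrained, randomly initialized layers: shapes only, not a real model.
kernels = rng.standard_normal((8, 7))       # 8 spectral filters of width 7
w_out   = rng.standard_normal((8, 2))       # 2 hypothetical output classes

spectrum = rng.random(BANDS)                # one pixel's spectrum
feat = np.maximum(conv1d(spectrum, kernels), 0)  # ReLU feature maps, (119, 8)
pooled = feat.mean(axis=0)                  # global average pool, (8,)
logits = pooled @ w_out                     # per-class scores, (2,)
print(logits.shape)                         # (2,)
```

The same convolution-over-bands idea extends to the full cube by applying it per pixel, or jointly over spatial and spectral axes, which is what "learning in multidimensional space" refers to in the interview.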