
Deep Learning Being Used to Detect Earliest Stages of Alzheimer’s Disease 


The rise of precision medicine is being accelerated by deep learning technologies that provide predictive analytics for earlier diagnosis of a range of debilitating diseases.

The latest example comes from researchers at Michigan-based Beaumont Health who used deep learning to analyze genomic DNA. The resulting simple blood test could be used to detect Alzheimer’s disease at an earlier stage.

In a study published this week in the peer-reviewed scientific journal PLOS ONE, the researchers said their analysis discovered 152 “significant” genetic differences between Alzheimer’s patients and healthy patients. Those biomarkers could be used to provide diagnoses before Alzheimer’s symptoms develop and a patient’s brain is irreversibly damaged.

“The holy grail is to identify patients in the pre-clinical stage so effective early interventions, including new medications, can be studied and ultimately used,” said Dr. Ray Bahado-Singh, a Beaumont Health geneticist who led the research.

The need to identify the early signs of Alzheimer’s disease grows as the global population ages. For example, the annual World Alzheimer Report estimates that 75 million people will be stricken by 2030. Researchers are working to prevent some of those predicted cases by leveraging new deep learning tools to accelerate the diagnosis of a disease that often goes undetected until it is too late to stop the damage.

The Beaumont researchers said they used deep learning and other machine learning platforms along with “genome-wide” DNA analysis of leukocytes, the white blood cells produced in bone marrow that form part of the body’s immune system.

“We used and compared conventional machine learning and deep learning classification algorithms which typically begin with an established set of data … and a certain understanding of how that data is classified” as either Alzheimer's or healthy patients, said co-investigator Buket Aydas, analytics manager at Blue Cross Blue Shield of Michigan.

“These algorithms are intended to find patterns in data that can be applied to an analytics process,” Aydas added in an email.
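In concrete terms, that workflow means training a classifier on labeled examples and then checking it against cases it has not seen. The minimal Python/scikit-learn sketch below illustrates the idea; the features, labels and model choice are synthetic stand-ins for illustration, not the study’s genomic data or its algorithms.

```python
# Minimal sketch of the supervised-classification setup described above:
# start with labeled examples (Alzheimer's vs. healthy) and learn patterns
# that generalize to held-out cases. All data here are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))      # 200 hypothetical patients, 50 genomic features
y = rng.integers(0, 2, size=200)    # 1 = Alzheimer's, 0 = healthy (the "established" labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```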

The researchers compared the performance of their deep learning framework with five other machine learning algorithms, including a prediction analysis tool. The six platforms scanned about 800,000 changes in the leukocyte genome.

The deep learning algorithm performed best.
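As an illustration of what such a head-to-head comparison involves, the sketch below scores several off-the-shelf classifiers on the same labeled data using cross-validation. The models, parameters and random data are assumptions chosen for brevity, not the six platforms or the roughly 800,000 genomic sites from the study.

```python
# Illustrative comparison of several classifiers on the same labeled data,
# scored with 5-fold cross-validation. The models and random features are
# placeholders, not the study's six platforms or its genomic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 1000))    # far fewer features than ~800,000, for brevity
y = rng.integers(0, 2, size=150)

models = {
    "logistic_regression": LogisticRegression(max_iter=2000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm": SVC(),
    "neural_net": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```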

“We also found out the important genetic features that contribute most to the [deep learning] prediction and were able to predict the absence or presence of Alzheimer’s by the help of these important genetic features,” Aydas said.

The genetic analysis ultimately predicted either the absence or presence of the disease, “allowing us to read what is going on in the brain through the blood,” Dr. Bahado-Singh said.
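One common, model-agnostic way to surface the features that drive a trained classifier’s predictions is permutation importance, sketched below on placeholder data. The study’s exact feature-ranking method is not detailed here, so this is an assumed illustration rather than the team’s procedure.

```python
# Permutation importance: shuffle one feature at a time and measure how much
# the model's score drops, to rank which features matter most. Data and model
# are placeholders; the study's own ranking method may differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 300))
y = rng.integers(0, 2, size=150)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
top = np.argsort(result.importances_mean)[::-1][:10]
print("indices of the ten most influential features:", top)
```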

One problem encountered by the investigators was “overfitting,” which occurs when a machine learning model fits its training data too precisely. Counterintuitively, the snug fit often produces unreliable results on new data.

To avoid overfitting in the deep learning framework, the researchers said they tuned their models with standard parameter settings.
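In practice, that kind of tuning often amounts to a cross-validated search over regularization settings. The generic sketch below shows one version of the idea; the parameter grid, network size and use of early stopping are assumptions for illustration, not the study’s actual procedure.

```python
# Generic anti-overfitting tuning: a cross-validated grid search over the
# L2 penalty of a small neural network, with early stopping on a validation
# split. Parameter values and data are assumptions for illustration.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 100))
y = rng.integers(0, 2, size=200)

search = GridSearchCV(
    MLPClassifier(hidden_layer_sizes=(32,), early_stopping=True,
                  max_iter=500, random_state=0),
    param_grid={"alpha": [1e-4, 1e-3, 1e-2, 1e-1]},  # candidate regularization strengths
    cv=5,
)
search.fit(X, y)
print("best alpha:", search.best_params_["alpha"])
print("cross-validated accuracy: %.3f" % search.best_score_)
```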

The researchers said the next step is an expanded study over the next year designed to replicate the initial findings of the Alzheimer's analysis. Advances in this branch of precision medicine could lead to development of targeted treatments to “interrupt the disease process,” according to Dr. Bahado-Singh.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
