Digital latency measures may enhance early detection of cognitive decline, revealing neurodegenerative changes before traditional tests indicate impairment.
Response latency measures captured during digitally administered cognitive testing may reveal neurodegenerative changes long before traditional neuropsychological scores indicate impairment, according to a recent review of research published in the Journal of Alzheimer's Disease.1
The accumulating evidence evaluated in the review challenges the traditional binary framework of cognitive assessment, where an individual either passes or fails neuropsychological tests and impairment is detected only after substantial cognitive deterioration has occurred.
Even individuals who meet biological criteria for Alzheimer disease can remain clinically asymptomatic for years, underscoring a need for behavioral tools that can flag emerging decline, the authors stressed.1
A rapidly evolving approach known as precision neurocognition is changing how clinicians and researchers conceptualize cognitive evaluation, David J Libon, PhD, professor in the department of geriatrics and gerontology and in the department of psychology at Rowan University in New Jersey, and colleagues wrote. The framework leverages digital assessment technologies to capture subtle behavioral markers, including response timing patterns, that emerge years before conventional measures reveal deficits.1
Libon et al cite research showing that, even among individuals whose neurocognitive test scores fall within normal limits and who achieve perfect or near-perfect accuracy, time-based latency measures can distinguish cognitively healthy individuals from those at risk for mild cognitive impairment (MCI) or dementia.2 Libon et al believe the approach reflects a new class of sensitive, process-oriented biomarkers that could revolutionize early detection and intervention strategies.
The review draws on recent studies using digital versions of traditional tests, including the Backward Digit Span Test (BDST), Philadelphia (repeatable) Verbal Learning Test (PrVLT), and Digital Clock Drawing Test (DCTclock). In BDST studies with memory clinic patients, total scores were often perfect, yet response timing revealed critical differences: individuals without MCI showed longer latencies early in trials (suggesting greater cognitive preparation), whereas participants with MCI showed slower mid-task responses (reflecting difficulty maintaining mental set). Despite 100% accuracy, participants with MCI took approximately 0.75 seconds longer than controls to produce the third digit of a response.3
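The review does not reproduce the studies' scoring pipelines, but the underlying idea can be sketched simply. Assuming a digital BDST log records a timestamp for each digit a participant produces, per-position latencies can be derived by differencing and averaged across trials; the function names and timings below are hypothetical and serve only to illustrate the kind of measure being compared, not the published method.

```python
# Illustrative sketch only: the review does not publish the BDST scoring code.
# Assumes each trial logs a timestamp (in seconds) for every digit produced,
# so per-digit latencies can be derived by simple differencing.

from statistics import mean

def per_digit_latencies(timestamps):
    """Return the time elapsed before each digit in a single trial.

    `timestamps` is assumed to be [stimulus_offset, t_digit1, t_digit2, ...]:
    the moment the examiner finished presenting the span, then the moment
    each response digit was spoken or keyed.
    """
    return [t1 - t0 for t0, t1 in zip(timestamps, timestamps[1:])]

def mean_latency_at_position(trials, position):
    """Average latency for one response position (eg, the third digit)
    across all trials in which that digit was actually produced."""
    values = [per_digit_latencies(t)[position] for t in trials
              if len(t) > position + 1]
    return mean(values) if values else None

# Hypothetical data: every digit is correct in both groups, so the group
# difference appears only in timing (third digit slower in the MCI group).
control_trials = [[0.0, 1.2, 2.1, 3.0], [0.0, 1.1, 2.0, 2.9]]
mci_trials     = [[0.0, 1.0, 2.3, 3.9], [0.0, 1.1, 2.4, 4.0]]

print(mean_latency_at_position(control_trials, 2))  # ~0.9 s
print(mean_latency_at_position(mci_trials, 2))      # ~1.6 s
```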
The authors review PrVLT analyses demonstrating that latency to recognize target words distinguished among normal cognition, amnestic MCI, dysexecutive MCI, and dementia groups even when accuracy scores were similar.2 Regression models linked slower recognition latency with impaired episodic memory, executive dysfunction, and language deficits, suggesting that correct recognition paired with longer latency is more informative than test scores alone.2 Similarly, DCTclock captured diagnostic value through intracomponent latencies, eg, delays between completing the clock face and beginning the hands, which were significantly prolonged in MCI and associated with working memory, processing speed, and language deficits.4 In an analysis of more than 2,000 participants in the Framingham Heart Study, digital drawing latency scores correlated with Alzheimer disease polygenic risk even among cognitively normal older adults.5
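An intracomponent latency such as the face-to-hands pause can, in principle, be read directly off a timestamped pen-stroke log. The sketch below illustrates that idea under that assumption; it is not DCTclock's actual scoring, and the Stroke structure, component labels, and timings are hypothetical.

```python
# Illustrative sketch only: DCTclock's scoring is not described in the review.
# Assumes the digital pen log labels each stroke with the clock component it
# belongs to ("face" or "hands") and start/end times in seconds.

from dataclasses import dataclass

@dataclass
class Stroke:
    component: str   # eg, "face" or "hands"
    start: float     # seconds from task onset
    end: float

def intercomponent_latency(strokes, first="face", second="hands"):
    """Pause between finishing one drawing component and starting the next,
    eg, between the last stroke of the clock face and the first stroke of
    the hands; prolonged pauses were associated with MCI in the cited work."""
    end_first = max(s.end for s in strokes if s.component == first)
    start_second = min(s.start for s in strokes if s.component == second)
    return start_second - end_first

# Hypothetical pen log for one participant
log = [
    Stroke("face", 0.0, 4.1),    # circle
    Stroke("face", 5.0, 13.8),   # numbers, grouped with the face here
    Stroke("hands", 17.5, 19.9), # hour and minute hands
]
print(intercomponent_latency(log))  # 17.5 - 13.8 = 3.7 s pause before the hands
```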
In a 2024 study, Libon et al found that time-based metrics were also linked to difficulties with instrumental activities of daily living (IADLs).6 Digital Trail Making Test studies showed that subtle changes in drawing behavior, including longer pauses and additional pen strokes, differentiated individuals with mild IADL impairment from those without, strengthening evidence that latency metrics serve as early behavioral flags for functional decline before conventional dementia criteria are met.6
In the current review, Libon and his colleagues caution that these latency-based biomarkers have yet to be validated in longitudinal studies. They also note that aging itself affects processing speed, making it essential to distinguish normal variability from pathological change. Still, they posit that traditional binary scoring systems obscure critical signals heralding cognitive decline that time-based parameters can detect.
Precision neurocognition, they stress, is a methodology-agnostic framework that can be applied through various digital tools, ranging from latency analyses to intraindividual variability to speech signal processing. Libon et al look forward to an era of personalized and preclinical detection, where behavior, not just biomarkers, guides earlier intervention for Alzheimer disease and related dementias.
"[D]igital assessment technology clearly deserves a seat at the table along with other existing technologies to help mitigate the malignant impact of AD and other dementia illnesses," they concluded.