Highlights
– Older adults with postlingual hearing loss show preserved early agent-based semantic prediction but delayed verb-driven prediction during sentence comprehension.
– The delay is greater when listening demand is higher, implicating increased listening effort or limited cognitive resources as a mechanism.
– No group differences appeared for semantically neutral sentences or for the cost of incorrect predictions, suggesting delays reflect prediction timing rather than slower lexical access or higher cost of failure.
Background
Human listeners routinely use linguistic context to anticipate upcoming words, a process that facilitates rapid comprehension in everyday conversation. These predictive mechanisms are especially important when the acoustic signal is degraded or masked, because foreknowledge narrows lexical candidates and reduces processing demands. Age-related hearing loss (presbycusis) produces a perceptual deficit: reduced audibility and degraded spectral and temporal detail in the speech signal. Beyond audibility, hearing loss is associated with greater listening effort and recruitment of cognitive resources to support perception. Disentangling the perceptual from the cognitive consequences of hearing loss is clinically important because it informs rehabilitation strategies (e.g., hearing-aid fitting, communication training) and sharpens our theoretical understanding of language processing under challenge.
Study design
Fernandez et al. (2025) used a visual-world eye-tracking paradigm to probe the timecourse of semantic prediction in older adults with and without hearing loss. The study enrolled three groups of older adults (ages 53–80): normal-hearing listeners (n = 30), listeners with hearing loss tested under low listening demand (n = 32), and listeners with hearing loss tested under high listening demand (n = 31). “Hearing loss” here refers to postlingual sensorineural impairment; the authors manipulated listening demand for the hearing-loss participants (details in the manuscript) to model differences in perceptual difficulty beyond pure audibility.
The experimental stimuli comprised highly constraining sentences designed to support two types of predictive processing. In sub-experiment 1, agent-based associative predictions were tested: listeners could anticipate a likely object largely from the agent noun (e.g., hearing “The gardener …” may bias looks toward garden-related objects such as “plants”). In sub-experiment 2, prediction was further refined by the verb, assessing how quickly listeners narrowed candidates once this additional linguistic constraint arrived (e.g., “watered” rules out garden objects that cannot be watered). Neutral sentences with low semantic constraint served as controls to evaluate whether any group differences reflected general slowing of lexical access.
Eye-movement trajectories were analyzed to quantify the timing and buildup of anticipatory looks to target versus competitor images. The authors also examined behavioral costs when predictions were violated.
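To make the analysis concrete, here is a minimal sketch (not the authors' actual pipeline) of how anticipatory looks can be quantified from visual-world data: fixation samples are binned over time relative to sentence onset, the proportion of looks to the target versus the competitor image is computed per bin, and the onset of a reliable target preference is estimated as the first bin in which target looks exceed competitor looks by a chosen margin. The data format (columns “time_ms” and “fixated”), bin size, and margin are illustrative assumptions.

```python
# Minimal illustrative sketch of a visual-world analysis (assumed data format;
# not the published pipeline). Each row of `samples` is one eye-tracking sample:
# time from sentence onset in ms, plus which picture was fixated
# ('target', 'competitor', or 'other').
import pandas as pd


def fixation_proportions(samples: pd.DataFrame, bin_ms: int = 50) -> pd.DataFrame:
    """Proportion of looks to each picture type within successive time bins."""
    df = samples.copy()
    df["bin"] = (df["time_ms"] // bin_ms) * bin_ms  # assign each sample to a time bin
    return (
        df.groupby("bin")["fixated"]
          .value_counts(normalize=True)             # proportion of samples per picture type
          .unstack(fill_value=0.0)
          .reindex(columns=["target", "competitor", "other"], fill_value=0.0)
    )


def prediction_onset(props: pd.DataFrame, margin: float = 0.10):
    """First time bin where target looks exceed competitor looks by `margin`."""
    diverged = props[props["target"] - props["competitor"] > margin]
    return None if diverged.empty else int(diverged.index[0])
```

Comparing such onset estimates across groups and demand conditions maps roughly onto the timing contrasts at issue here; published visual-world studies typically model full fixation trajectories statistically (for example, with growth-curve or cluster-based methods) rather than applying a simple threshold.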
Key findings
Preserved early (agent-based) prediction
Across groups, listeners generated early associative predictions based on the sentence agent. The onset and early magnitude of agent-driven anticipatory eye movements were statistically similar between normal-hearing participants and those with hearing loss, indicating that simple associative prediction can remain intact despite peripheral auditory impairment.
Delayed verb-driven prediction in hearing loss, amplified by listening demand
When verb information became available and could narrow potential continuations, hearing-loss participants showed a lag in the buildup and tailoring of predictions compared with normal-hearing peers. Importantly, this delay was exacerbated in the high-listening-demand condition, suggesting that reduced perceptual clarity, combined with increased demands on cognitive resources, slows the transition from coarse (agent-based) expectations to more specific, verb-guided predictions.
Neutral sentence control
No group differences were observed for semantically neutral, low-constraint sentences, arguing against an explanation based on generally delayed lexical access. If hearing-loss participants were uniformly slower to access words, one would expect delayed eye-movement patterns even in neutral contexts; instead, delays appeared selectively when prediction depended on rapid integration of unfolding linguistic cues.
No group differences in cost of incorrect predictions
The authors examined whether hearing-loss participants suffered greater penalties when predictions were incorrect (e.g., slower recovery or reduced accuracy) but found no reliable group differences. This suggests that while the timing of prediction formation is altered, the downstream ability to revise or recover from incorrect expectations may be intact in this cohort under experimental conditions.
Expert commentary and mechanistic interpretation
These results support a two-stage view of prediction during speech comprehension. The first stage is a relatively rapid, coarse associative sweep—often driven by thematic or agent information—that appears robust even with degraded input. The second stage entails rapid specification and narrowing of lexical candidates once verbs and more diagnostic cues arrive. This second stage is vulnerable to hearing loss and to increased demand because it requires faster online integration and allocation of limited cognitive resources (e.g., working memory, executive control).
Mechanistically, delays may reflect increased listening effort: listeners with hearing loss recruit additional cognitive capacity for bottom-up perceptual analysis, leaving fewer resources for top-down predictive operations. Alternatively, degraded input may increase lexical competition so that stronger evidence is required before committing to a narrowed prediction. The finding that neutral sentences did not show delays argues that basic lexical access is not uniformly slower; instead, integration-based prediction dynamics are specifically affected.
These findings align with theoretical frameworks such as the Ease of Language Understanding (ELU) model, which emphasizes the interplay between signal quality, working memory, and explicit processing when automatic matching between input and lexical representations fails (Rönnberg et al., 2013). The present study adds temporal specificity: it shows where in the predictive timecourse hearing loss exerts its effect.
Clinical and practical implications
Clinicians should recognize that hearing loss can impose not only audibility limits but also time-sensitive cognitive constraints on speech processing. In practice, this suggests several actionable points:
- Hearing-aid fittings should prioritize restoring signal clarity and improving signal-to-noise ratio, reducing the cognitive load required for bottom-up perception and freeing resources for prediction.
- Communication strategies (e.g., speaking slightly slower, providing contextual cues, using names/agents early) may help listeners form earlier coarse predictions and provide more time to refine them.
- Rehabilitation programs could consider training that targets rapid linguistic-integration skills or cognitive domains (working memory, attention) implicated in dynamic prediction.
From a policy perspective, understanding cognitive consequences of hearing loss strengthens the argument for timely diagnosis and intervention to mitigate downstream impacts on communication, social participation, and possibly cognitive health.
Limitations and future directions
Several limitations warrant mention. Participants were older adults with postlingual hearing loss; findings may not generalize to younger listeners, prelingual hearing loss, or more severe degrees of impairment. The laboratory visual-world task provides precise temporal measures but differs from real-world, multi-talker conversations with overlapping speech and rapid turn-taking. The study manipulated “listening demand” experimentally, but real-world demand fluctuates with social and environmental factors; ecological validation is needed.
Future work should couple eye-tracking with neural measures (EEG/MEG) to map the electrophysiological correlates of delayed prediction, assess whether amplified delays predict communicative breakdown or cognitive decline longitudinally, and evaluate whether hearing-aid signal processing can restore normal prediction timing. Interventions that explicitly train predictive language processing or reduce listening effort pharmacologically or behaviorally could be experimentally tested.
Conclusion
Fernandez et al. provide compelling evidence that postlingual hearing loss selectively delays the timecourse by which listeners refine semantic predictions as sentences unfold, particularly under greater perceptual demand. Early associative prediction based on agent information remains intact, suggesting a staged predictive architecture: a robust coarse prediction followed by a resource-sensitive fine-tuning stage. Clinically, this distinction matters because it implies that timely restoration of signal quality and strategies that reduce listening effort may preserve not only audibility but also the timing dynamics of comprehension that support fluent conversation.
Funding and clinicaltrials.gov
Funding and trial registration details were not specified in the provided article citation. Readers should consult the original publication for complete funding disclosures and any registry entries.
References
Fernandez LB, Shehzad M, Hadley LV. Effects of Hearing Loss on Semantic Prediction: Delayed Prediction for Intelligible Speech When Listening Is Demanding. Ear Hear. 2025 Nov-Dec 01;46(6):1440-1456. doi: 10.1097/AUD.0000000000001679. PMID: 40533889; PMCID: PMC12533763.
Rönnberg J, Rudner M, Lunner T, Zekveld AA. The Ease of Language Understanding (ELU) model: theoretical, empirical, and clinical advances. Front Syst Neurosci. 2013;7:31.
AI-friendly visual prompt for article thumbnail
A middle-aged or older adult wearing a discreet hearing aid seated at a café table engaged in conversation; the foreground shows semi-transparent floating word fragments and arrows converging into a single highlighted word above the talker’s mouth, with soft-focus background patrons and subtle noise icons indicating an acoustically challenging environment; warm natural lighting, realistic photographic style.

