We examined the real-time auditory processing of canonical sentence structures in neurologically unimpaired listeners and individuals with aphasia using a visual-world eye-tracking paradigm. The canonical sentence constructions contained multiple noun phrases and an unaccusative verb, which formed a long-distance dependency with its single argument (base-generated in object position and then displaced to subject position). To explore the likelihood of similarity-based interference during the real-time linking of the verb and the sentence's subject noun, we manipulated the animacy feature of the noun phrases (matched or mismatched). The study's objectives were to examine (a) whether reducing similarity-based interference by mismatching animacy features would modulate the encoding and retrieval dynamics of noun phrases in real time, and (b) whether individuals with aphasia would demonstrate timely sensitivity to this lexical-semantic cue. Results revealed a significant effect of this manipulation in individuals both with and without aphasia: mismatching the representational features of the noun phrases increased the distinctiveness of the unaccusative verb's subject target at the time of syntactic retrieval (verb offset) for both groups. Moreover, individuals with aphasia were sensitive to the lexical-semantic cue, although they processed it more slowly than unimpaired listeners. This study extends the cue-based retrieval model by providing new insight into the real-time mechanisms underpinning sentence comprehension.