Pre-morbid IQ continues to influence a range of cognitive tests after AD diagnosis. The effect size is modest, around 2.7% for every 10 IQ points, compared with its effect in non-demented older people. Unsurprisingly, the effect is greater for verbal than for non-verbal tests. For most tests there was a trend for higher pre-morbid IQ to be associated with relatively better performance on cognitive scores over time, and higher pre-morbid IQ was particularly associated with relative improvement in lexical verbal fluency, though this may reflect a relative lack of novelty on repeated testing. Being older also has a detrimental effect on cognitive scores, even in the presence of dementia, but has no effect on change in cognitive ability over time. As expected, given the selection of responders to continue treatment, there was no significant effect of treatment on cognitive change over time.

However, despite this cognitive stability, ADL scores deteriorated, especially instrumental activities of daily living. This may reflect a greater sensitivity of IADL to change compared with the PSMS. The effect size of pre-morbid IQ on ADL scores was similar to that of its effect on cognition, and to that seen cross-sectionally. Again, those with higher pre-morbid IQs seemed to be relatively protected from deterioration in ADL over time. However, once the effect of contemporaneous cognitive ability was adjusted for, pre-morbid IQ no longer exerted a significant effect on ADL. Thus the effects of pre-morbid IQ on ADL appear to be mediated via its effects on current cognitive ability in AD.

In this sample, younger men scored significantly better on instrumental activities of daily living, perhaps reflecting persisting social roles. This effect was distinct from any effect of living alone, suggesting that it could not be explained purely by the likelihood of younger men having wives who were still alive.
Alternatively, it may be a cohort effect: men who married before the end of the Second World War were less likely, pre-morbidly, to have performed household tasks such as cooking and laundry.
The effect of pre-morbid IQ on cognition in this longitudinal study was considerably less than its effect on MMSE on a cross-sectional basis. This is likely to reflect the superior design of longitudinal studies, which can adjust for within-subject effects that might otherwise be attributed to a stable trait such as pre-morbid IQ. Despite the moderate effect size, the finding supports a persisting effect of cognitive reserve on the absolute level of cognition, and to a lesser extent on the rate of cognitive decline, even in the presence of a dementing illness. Cognitive reserve is also important for ADL, but only inasmuch as it protects against cognitive impairment. This contrasts with its effect on the behavioural and psychological symptoms of dementia, where it is pre-morbid IQ rather than contemporaneous cognitive ability that is protective. Our data are consistent with those of Pavlik and colleagues, who followed 478 AD patients for 3.2 years. They investigated the effect of the American version of the NART on MMSE and ADAS-Cog scores, together with the Clinical Dementia Rating Scale, and found that higher pre-morbid IQ, but not education, protected against decline on these global cognitive and functional outcomes. By contrast, a study from the Baylor College of Medicine did not find such an effect on MMSE score decline, though the sample was smaller and thus may have been under-powered. Similarly, Drachman and colleagues found few significant effects on the rate of decline in AD despite studying a wide range of predictor variables in a sample of just 42 patients. On the other hand, Mortimer and colleagues found a significant association between psychotic symptoms and the rate of cognitive decline in a sample of 65 patients; we also reported a link between psychotic symptoms and cognitive status in our patients.
A Chicago study, which used a composite measure of pre-morbid reading ability, also failed to find a significant effect once other variables, including race, were adjusted for.
Though longitudinal studies have advantages for estimating effect sizes, they often suffer from the effects of attrition; Pavlik and colleagues, for example, had over 90% attrition by the fifth wave of observation. Indeed, deliberate selection was involved in this study. Mixed linear models can adjust for such attrition to some degree because they include data from all participants, not just those with observations at every wave. Moreover, by using dummy variables indicating presence or absence at any wave, we could test for attrition bias. There was none for NARTIQ; that is, people with lower pre-morbid IQs were no less likely to be assessed at week 26 (p = .13) or to be responders (p = .11). This finding, though, needs to be considered in the context of selection bias towards higher cognitive test scores among those patients who had a NARTIQ score. There were thus relatively few patients with very low cognitive test scores in the sample analysed, and an attrition bias might have emerged had all patients had a NARTIQ and been included in the longitudinal study. Moreover, there was a selection bias for ADL: people who were better at instrumental activities of daily living at baseline were more likely to be considered as responding to drug treatment. Hence, longitudinal effects of pre-morbid IQ on ADL may be less certain in those who presented with a lower level of self-care.
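The mixed-model strategy described above can be illustrated with a minimal sketch. This is not the study's actual code: the data are simulated, and all variable names (nart_iq, mmse, week, completer) and effect sizes are hypothetical. The point is that the model uses every available observation, and a dummy variable flagging completers provides a check for attrition bias of the kind described.

```python
# Minimal sketch (assumed, not the study's code): mixed linear model on
# simulated longitudinal data with a completer dummy to probe attrition bias.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, weeks = 120, [0, 13, 26, 39]

# Per-subject traits: pre-morbid IQ, a random intercept, completion status.
iq = rng.normal(105, 10, n_subj)
u = rng.normal(0, 1, n_subj)
completer = rng.random(n_subj) < 0.7  # ~30% drop out after week 13

rows = []
for i in range(n_subj):
    observed = weeks if completer[i] else weeks[:2]  # dropouts miss later waves
    for w in observed:
        # Higher IQ raises the level of cognition; scores decline with time.
        mmse = 20 + 0.10 * (iq[i] - 100) - 0.05 * w + u[i] + rng.normal(0, 2)
        rows.append({"subj": i, "week": w, "nart_iq": iq[i],
                     "completer": int(completer[i]), "mmse": mmse})
df = pd.DataFrame(rows)

# The model includes all observations, not just complete cases; the
# completer coefficient tests whether dropouts differ systematically.
res = smf.mixedlm("mmse ~ nart_iq + week + completer",
                  df, groups=df["subj"]).fit()
print(res.params[["nart_iq", "week", "completer"]])
```

Here attrition was simulated independently of IQ, so the completer coefficient should sit near zero; a significant coefficient would flag the attrition bias that the dummy-variable check is designed to detect.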
In addition to attrition bias on ADL scores, the sample had an estimated pre-morbid IQ a little above average. This would, however, be expected at the mean age of the sample, given the association between lower childhood IQ and shorter life expectancy. The sample was also selected by the nature of referral pathways to a tertiary clinic: not only did patients have to present to their general practitioner, they then had to be referred to a consultant and, by the consultant, to the memory treatment centre. This process generally took some time, so patients who were deteriorating rapidly were unlikely to have reached the clinic. It may be, therefore, that any effects of cognitive reserve would have been smaller in patients who did not reach the clinic. Another limitation of the study is the range of cognitive tests: several cognitive domains were not specifically tested, and these may be more or less influenced by cognitive reserve in AD. In addition, there were a number of explanatory factors that we did not take into account. Higher depression scores were associated with an increased rate of decline in 102 Catholic clergy with AD. However, low mood correlates with NARTIQ in AD patients, which that study did not take into account, so its findings could be explained by pre-morbid ability. One explanatory factor we were unable to investigate was the effect of race, because of the limited ethnic variation of Lothian residents. Nor did we take into account apolipoprotein E ε4 status, which has been implicated as a predictor of decline in AD. These limitations indicate that further studies in other samples, able to consider other explanatory variables, would be useful.