Intrasubtest Scatter in Neuropsychology

Intrasubtest scatter (ISS) is a tendency toward unusual patterns in responses to test items (Wechsler, 1958). In Rasch terms, ISS is manifested as misfit to the model. The items on the Wechsler scales are ordered from easiest to most difficult using data from the standardization sample. Inconsistent patterns consist of failing some of the easier items while passing harder ones, or of isolated failures within long runs of correct responses.
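The logic of treating scatter as Rasch misfit can be sketched in a few lines: under the dichotomous Rasch model, the probability of success depends only on the difference between person ability and item difficulty, so a failure on an item the person was very likely to pass is an "unexpected" response. The ability, difficulties, and responses below are illustrative assumptions, not Wechsler values.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the dichotomous Rasch model,
    for person ability theta and item difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical responses of one able examinee to items ordered easy -> hard.
difficulties = [-2.0, -1.5, -1.0, 0.0, 1.0, 1.5]
responses    = [1,     0,    1,   1,   1,   0]   # note the failed easy item
theta = 1.2

for b, x in zip(difficulties, responses):
    p = rasch_p(theta, b)
    if x == 0 and p > 0.75:  # failed an item this person was expected to pass
        print(f"unexpected failure: difficulty {b:+.1f}, P(correct) = {p:.2f}")
```

Only the failure on the item at difficulty -1.5 is flagged; the failure on the hardest item is unsurprising for this ability level, which is exactly the asymmetry that distinguishes scatter from ordinary errors.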

Neuropsychological assessment and diagnosis often involve the detection and measurement of off-task or unusual responses in individuals suspected of cognitive dysfunction. An unusual item response pattern requiring neuropsychological interpretation occurs when an individual of high ability tends to fail easy items. This pattern may indicate cognitive inefficiency, difficulty recalling specific information, or variable levels of arousal or attention. Almost all neuro-diagnostic interpretation of ISS has been conducted on adult populations using the Wechsler Adult Intelligence Scale (WAIS) or its revision (WAIS-R). In a review of their own and previous research, Mittenberg et al. (1991) concluded that scatter is associated with diffuse rather than focal neurological damage, which results in either random loss of stored information or variable levels of arousal or attention. Surprisingly, the diagnostic utility of ISS has received comparatively little investigation, in spite of frequently repeated recommendations that item scatter be interpreted as a qualitative indicator of cognitive dysfunction.

Our own study investigated the clinical utility of ISS in children experiencing attentional and information processing difficulties. The WISC-R item responses of 100 children who had received cranial irradiation treatment (with its risk of brain damage) for acute lymphoblastic leukaemia (ALL) were compared with those of 100 healthy children. The degree to which subjects in each of the two groups responded to items as predicted by an estimate of their ability was analyzed with a Rasch partial credit computer program (Adams & Khoo, 1993). The person Infit mean-square was the critical indicator.
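The person infit mean-square used here as the critical indicator is, for dichotomous items, the information-weighted sum of squared residuals divided by the summed binomial variances. A minimal sketch follows; the abilities, difficulties, and response patterns are hypothetical, and values well above 1 indicate unexpectedly scattered responding.

```python
import math

def rasch_p(theta, b):
    # Dichotomous Rasch probability of success (logits).
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def infit_mnsq(theta, difficulties, responses):
    """Person infit mean-square: sum of squared score residuals divided by
    the sum of the model variances p*(1-p) across the person's items."""
    num = den = 0.0
    for b, x in zip(difficulties, responses):
        p = rasch_p(theta, b)
        num += (x - p) ** 2
        den += p * (1.0 - p)
    return num / den

# Two hypothetical patterns at the same ability and same raw score:
# a consistent (Guttman-like) pattern, and a scattered one in which an
# easy item is failed and a hard item is passed.
diffs      = [-2.0, -1.0, 0.0, 1.0, 2.0]
consistent = [1, 1, 1, 0, 0]
scattered  = [0, 1, 1, 0, 1]

print(infit_mnsq(0.0, diffs, consistent))  # below 1: orderly responding
print(infit_mnsq(0.0, diffs, scattered))   # well above 1: flags scatter
```

Both patterns yield the same raw score, so total-score analysis cannot distinguish them; only the fit statistic separates the scattered protocol from the orderly one.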

Since brain damage affects performance differentially, variation in subject ability across the eight Wechsler subtests was able to classify correctly 60% of the ALL cases and 70% of the controls. ISS subject misfit statistics derived from the eight subtests classified correctly 61% of the ALL patients and 69% of the controls. The combination of ability variation across subtests and misfit within subtests correctly classified 71% of the ALL cases and 76% of the controls. That the scatter misfit statistics were as successful as subtest ability variation in identifying subject type supports the conclusion that ISS is diagnostically useful for cases with relatively normal intellectual profiles.
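The two indicators compared above can be reduced to one pair of features per child: the spread of ability estimates across subtests (between-subtest variation) and the average infit mean-square (within-subtest scatter). The sketch below, with an entirely hypothetical eight-subtest profile, shows why the second feature adds value: a flat intellectual profile can still carry marked misfit.

```python
import statistics

def scatter_features(subtest_results):
    """Given (ability, infit_mnsq) pairs for each subtest, return the two
    indicators compared in the study: between-subtest ability variation
    (SD of the logit abilities) and mean within-subtest misfit."""
    thetas = [t for t, _ in subtest_results]
    infits = [f for _, f in subtest_results]
    return statistics.stdev(thetas), statistics.mean(infits)

# Hypothetical profile over eight subtests: (logit ability, infit MNSQ).
flat_but_scattered = [(0.1, 1.6), (0.0, 1.4), (0.2, 1.5), (-0.1, 1.7),
                      (0.0, 1.3), (0.1, 1.6), (-0.2, 1.5), (0.0, 1.4)]

sd_theta, mean_infit = scatter_features(flat_but_scattered)
# Low SD of ability (flat profile) but high mean infit (marked scatter):
# the kind of case that subtest-variation indices alone would miss.
print(f"ability SD = {sd_theta:.2f}, mean infit = {mean_infit:.2f}")
```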

With the widespread and increasing use of large-scale testing programs for school-age students, and with the computerized scoring of these test protocols, item scatter scores (misfit statistics) can become a screening measure for students with unrecognized cognitive dysfunction. Our research indicates that unusual response patterns on reading, spelling and arithmetic tests will prove to be useful pathognomonic signs. A useful next step would be an investigation of the usefulness of item response variability on the Wide Range Achievement Test (WRAT; Jastak & Jastak, 1984; RMT 8:4 403-404) as a pathognomonic sign.
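Once protocols are scored by computer, screening on misfit reduces to a simple cutoff rule. A minimal sketch, assuming hypothetical student records and the common rule-of-thumb infit mean-square cutoff of 1.3 (the cutoff is an assumption, not a value from the study):

```python
# Flag students whose person infit mean-square exceeds a rule-of-thumb
# cutoff; both the cutoff and the (id, infit) records are hypothetical.
CUTOFF = 1.3

students = [("A", 0.97), ("B", 1.52), ("C", 1.08), ("D", 1.41)]

flagged = [sid for sid, infit in students if infit > CUTOFF]
print(flagged)  # ['B', 'D']
```

Flagged students would then be referred for individual follow-up assessment rather than diagnosed from the screen alone.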


Adams, R.J., & Khoo, S. (1993). Quest: The Interactive Test Analysis System. Melbourne: ACER.

Jastak, J., & Jastak, S. (1984). The Wide Range Achievement Test. Wilmington, DE: Jastak.

Mittenberg, W., Thompson, G.B., Schwartz, J.A., Ryan, J.J., & Levit, R. (1991). Intellectual loss in Alzheimer's dementia and WAIS-R intrasubtest scatter. Journal of Clinical Psychology, 19, 420-423.

Wechsler, D. (1958). Measurement and Appraisal of Adult Intelligence (4th ed.). Baltimore: Williams & Wilkins.

Intrasubtest scatter in paediatric neuropsychology. Godber T, Anderson V. … Rasch Measurement Transactions, 1996, 9:4 p.469
