It has been argued that item difficulty can affect the fit of a confirmatory factor analysis (CFA) model (McLeod, Swygert, & Thissen, 2001; Sawaki, Stricker, & Oranje, 2009). We explored the effect of items with outlying difficulty measures on the CFA model of the listening module of the International English Language Testing System (IELTS). The test has four sections comprising 40 items altogether (10 items in each section). Each section measures a different listening skill, making the test a conceptually four-dimensional assessment instrument.
We observed two items in section 1 with unusually low (outlying) Rasch difficulty measures but poor fit to the Rasch model (item 8: measure = -1.71, infit MNSQ = 1.43; item 9: measure = -1.59, infit MNSQ = 1.36), and one item in section 4 with an unusually high (outlying) Rasch difficulty measure and good fit to the Rasch model (item 38: measure = 3.01, infit MNSQ = 0.99). On the Wright map, there was a large gap between these items and the rest of the items in their respective sections.
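The infit mean-square values reported above are information-weighted fit statistics from the dichotomous Rasch model. The sketch below shows how they are computed in Python/NumPy; the person measures, item difficulties, and response matrix are simulated placeholders, not the IELTS data.

```python
import numpy as np

def rasch_prob(theta, b):
    """P(X = 1) under the dichotomous Rasch model, given person measure theta and item difficulty b (logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def infit_mnsq(responses, theta, b):
    """Information-weighted (infit) mean-square fit statistic for each item.

    responses : (n_persons, n_items) matrix of 0/1 scores
    theta     : (n_persons,) person measures in logits
    b         : (n_items,) item difficulty measures in logits
    """
    p = rasch_prob(theta[:, None], b[None, :])      # model-expected scores
    w = p * (1.0 - p)                               # model variances (information)
    sq_resid = (responses - p) ** 2                 # squared score residuals
    return sq_resid.sum(axis=0) / w.sum(axis=0)     # infit = sum of squared residuals / sum of variances

# Simulated example: 148 persons, 10 items, two items much easier than the rest.
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, size=148)
b = np.array([-1.71, -1.59, -0.30, -0.10, 0.00, 0.15, 0.25, 0.40, 0.55, 0.70])
x = (rng.random((148, 10)) < rasch_prob(theta[:, None], b[None, :])).astype(int)
print(np.round(infit_mnsq(x, theta, b), 2))   # values near 1.0 fit the model; values around 1.3-1.4 suggest misfit
```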
Initially, we proposed separate CFA models for sections 1 and 4 to investigate the causes of variation in the measurements (items). In each model, a single latent trait was measured by the section's 10 items. The 10-item CFA model for section 1 had a significant chi-square index, indicating rejection of the hypothesis that one factor accounts for the data, although the other fit indices fell within the acceptable range (Table 1). The two outlying items did not load significantly on the latent trait at the 5% level. In a post hoc modification stage, we removed items 8 and 9 from the analysis and recalculated the fit of the modified 8-item CFA model. We observed a noticeable improvement in the fit of the items to the one-factor model, which we expected given the two items' poor fit to the Rasch model.
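For readers who want to reproduce this kind of analysis, the sketch below fits a one-factor CFA by maximum likelihood and computes the chi-square, RMSEA, and CFI indices of the kind shown in Table 1. It is not the estimator or software used in the study (the article does not specify one); the function name is a hypothetical placeholder, and applying a linear ML CFA directly to dichotomous item scores is a simplification of what a full analysis of categorical responses would require.

```python
import numpy as np
from scipy.optimize import minimize

def fit_one_factor_cfa(data):
    """Fit a one-factor CFA by maximum likelihood.

    data : (n, p) array of item scores (rows = persons, columns = items).
    Model-implied covariance: Sigma = lambda lambda' + diag(psi),
    with the factor variance fixed at 1 for identification.
    Returns chi-square, df, RMSEA, CFI, and the estimated loadings.
    """
    n, p = data.shape
    S = np.cov(data, rowvar=False)
    _, logdet_S = np.linalg.slogdet(S)

    def discrepancy(params):
        lam, psi = params[:p], np.exp(params[p:])        # exp() keeps unique variances positive
        Sigma = np.outer(lam, lam) + np.diag(psi)
        _, logdet = np.linalg.slogdet(Sigma)
        # ML discrepancy: F = ln|Sigma| + tr(S Sigma^-1) - ln|S| - p
        return logdet + np.trace(S @ np.linalg.inv(Sigma)) - logdet_S - p

    start = np.concatenate([np.full(p, 0.5), np.log(0.5 * np.diag(S))])
    res = minimize(discrepancy, start, method="L-BFGS-B")

    chi2 = (n - 1) * res.fun
    df = p * (p + 1) // 2 - 2 * p                        # distinct (co)variances minus free parameters
    chi2_base = (n - 1) * (np.sum(np.log(np.diag(S))) - logdet_S)   # independence (baseline) model
    rmsea = np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    cfi = 1.0 - max(chi2 - df, 0.0) / max(chi2_base - p * (p - 1) // 2, chi2 - df, 1e-12)
    return {"chi2": chi2, "df": df, "rmsea": rmsea, "cfi": cfi, "loadings": res.x[:p]}
```

NNFI and GFI could be derived from the same chi-square quantities; they are omitted here to keep the sketch short.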
Likewise, we estimated the 10-item CFA model for section 4, which did not exhibit acceptable fit indices either. When item 38, whose high difficulty measure lay far from the rest of the items, was deleted, a noticeably better fit to the one-factor model was obtained. This was somewhat surprising, because the deleted item exhibited good fit to the unidimensional Rasch model.
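Under the same assumptions, the before-and-after comparison for section 4 amounts to re-fitting the model with the outlying item's column removed, for example as follows (reusing the hypothetical fit_one_factor_cfa sketch above; the array section4 and the column index 9 standing in for item 38 are illustrative):

```python
import numpy as np

# 'section4' is assumed to be an (n, 10) array of 0/1 scores for section 4,
# with column 9 corresponding to item 38 (the outlyingly difficult item).
full_fit = fit_one_factor_cfa(section4)
reduced_fit = fit_one_factor_cfa(np.delete(section4, 9, axis=1))
for name in ("chi2", "df", "rmsea", "cfi"):
    print(f"{name}: 10 items = {full_fit[name]:.3f}, 9 items = {reduced_fit[name]:.3f}")
```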
This analysis supports the results of previous studies showing that item difficulty can affect the fit of CFA models: items with outlying difficulty measures can compromise model fit. It may therefore be useful to delete items with outlying Rasch difficulty measures before conducting a CFA, or during post hoc modification stages.
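One way such screening could be operationalized (a hypothetical rule of thumb, not a procedure proposed in the article) is to flag items separated from the main cluster of difficulties by a wide gap on the logit scale, mirroring the Wright-map inspection described above:

```python
import numpy as np

def flag_gap_outliers(measures, gap=1.0):
    """Flag items separated from the main cluster by a gap wider than `gap` logits.

    Sorts the difficulties, locates consecutive gaps wider than `gap`, and flags the
    smaller group on the far side of the widest such gap.
    """
    measures = np.asarray(measures, dtype=float)
    order = np.argsort(measures)
    gaps = np.diff(measures[order])
    wide = np.where(gaps > gap)[0]
    if wide.size == 0:
        return []                              # no outlying gap
    k = wide[np.argmax(gaps[wide])] + 1        # split point after the widest qualifying gap
    low, high = order[:k], order[k:]
    flagged = low if low.size <= high.size else high
    return sorted(flagged.tolist())

# Section 1 difficulties: the first two values are the reported easy outliers
# (items 8 and 9); the remaining values are illustrative placeholders.
section1_b = [-1.71, -1.59, -0.30, -0.10, 0.00, 0.15, 0.25, 0.40, 0.55, 0.70]
print(flag_gap_outliers(section1_b, gap=1.0))   # -> [0, 1]
```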
S. Vahid Aryadoust
Nanyang Technological University, Singapore
References
McLeod, L.D., Swygert, K.A., & Thissen, D. (2001). Factor analysis for items scored in two categories. In D. Thissen & H. Wainer (Eds.), Test Scoring (pp. 189-216). Hillsdale, NJ: Lawrence Erlbaum Associates.
Sawaki, Y., Stricker, L.J., & Oranje, A.H. (2009). Factor structure of the TOEFL Internet-based test. Language Testing, 26(1), 5-30.
See also "Too Many Factors", RMT 8:1, 347.
Table 1. Fit indices for the CFA models of sections 1 and 4

| Model | χ² | df | χ²/df | NNFI | CFI | GFI | RMSEA | RMSEA 90% confidence interval |
|---|---|---|---|---|---|---|---|---|
| Section 1 (10 items) | 29.68* | 35 | 0.85 | 1.06 | 1.00 | 0.95 | 0.001 | 0.001 to 0.051 |
| Section 1 (8 items) | 10.54 | 20 | 0.53 | 1.12 | 1.00 | 0.98 | 0.001 | 0.001 to 0.001 |
| Section 4 (10 items) | 72.90** | 35 | 2.08 | 0.94 | 0.95 | 0.88 | 0.100 | 0.067 to 0.130 |
| Section 4 (9 items) | 54.54* | 34 | 1.60 | 0.96 | 0.97 | 0.91 | 0.076 | 0.036 to 0.110 |
| Criteria for acceptable fit | Non-significant | - | < 2 | .95 | .90 | .90 | < 0.06 | Narrow interval |

Note. n = 148. **p < 0.001. *p < 0.01. df = degrees of freedom. NNFI = Non-Normed Fit Index. CFI = Comparative Fit Index. GFI = Goodness of Fit Index. RMSEA = Root Mean Square Error of Approximation.
For more information, see:
Aryadoust, S.V. (2009). The impact of Rasch item difficulty on confirmatory factor analysis. Rasch Measurement Transactions, 23:2, 1207.
Ewing, M.T., Salzberger, T., & Sinkovics, R.R. (2009). Confirmatory factor analysis vs. Rasch approaches: Differences and measurement implications. Rasch Measurement Transactions, 23:1, 1194-1195.
Wright, B.D. (2000). Conventional factor analysis vs. Rasch residual factor analysis. Rasch Measurement Transactions, 14:2, 753.
Linacre, J.M. (1998). Rasch analysis first or factor analysis first? Rasch Measurement Transactions, 11:4, 603.
Schumacker, R.E., & Linacre, J.M. (1996). Factor analysis and Rasch analysis. Rasch Measurement Transactions, 9:4, 470.
Bond, T.G. (1994). Too many factors in factor analysis? Rasch Measurement Transactions, 8:1, 347.
Wright, B.D. (1994). Comparing factor analysis and Rasch measurement. Rasch Measurement Transactions, 8:1, 350.
Wright, B.D. Factor analysis vs. Rasch analysis of items. Rasch Measurement Transactions, 5:1, 134.