What We Have To Offer

My aim is to invite you to take action with respect to the emergence and consolidation of the movement toward fundamental measurement in education and social science. This letter is prompted by some things that happened at the San Francisco AERA meetings. Several SIG members commented that our presentations were overly technical, that our visuals were poorly designed and poorly made, and that our communication skills lacked dynamism and clarity.

Poor communication is bad enough, but when the speakers are presenting work that offers stunning possibilities for the future of a field, poor communication takes on tragic overtones. Initial discussions about this problem centered on improving communication skills - how to write and speak clearly, and how to make and use legible overheads. The idea was to use the autumn MOMS on December 8 to experiment with next year's AERA presentations.

Though these communication problems need to be addressed, discussion has led to recognition of the possibility that they are symptoms of a deeper problem. Beneath the surface of the communication problem lies the fact that Rasch measurement has gone through some important changes in the 1980s. These changes are part of an evolution that proceeds as much in terms of value choices and power plays as in terms of the irresistible emergence of new scientific structures. We need to be asking ourselves some questions about what we are doing: Where did this movement come from? Where is it right now? How did it get here? Where is it going? What alternative directions might be taken? By what criteria do we choose among them? This letter cannot answer these questions, nor can it even articulate each of them properly; but it can raise them and, by pointing to the need to address them, open them up for discussion.

We could begin by taking a lesson from the history of science. Kuhn, Toulmin, and Fleck show that there is not a one-way, linear relation between the development of theory and the development of technology in the practice of a scientific discipline. Technological development occurs in contexts that are largely free of theorizing about technical effects. Science enters the picture in the course of trying to understand the whys and wherefores of the effects of a technical development. A back-and-forth motion between theory and observation ensues, with each modifying the other (Price 1986, Heelan 1983). Questions arise as to how and why an effect occurred, and attempts are made to recreate and alter the effect, with the aim of understanding the conditions of its existence.

Sometimes a clearly delineated effect becomes widely recognized as particularly useful only in special political or economic circumstances. The effect may not be politically or economically viable until demand for it reaches a critical point, as in a crisis. As a technology distinguishes itself and specialists begin the practical task of establishing its field of application, the specialty group begins to articulate theoretical terms to use as a cognitive base from which to explain the practical meaning of the technology (Gritzer and Arluke 1985: 6-8).

Innumerable quotations from the literature of educational measurement could be marshalled to show that fundamental measurement offers a clearly defined and essential product - sample-free instrument calibration and instrument-free person measurement - and that there is great demand for this product, since it constitutes the most crucial assumption, goal and requirement of objective measurement. We also have widespread recognition that education is in crisis. What is lacking is an understanding of the basic relation between fundamental measurement and the crisis of education. The crisis has come about, in part, because the requirements of measurement are not being met. When our meter-sticks are not calibrated to measure objectively on stable number lines of more and less, we risk the possibility that not one but several contradictory standards are being set at the same time. In order to participate in solutions to the crisis, educators need to be sure they are not imposing double standards. For this we need to make fundamental measurement more recognizable and accessible to educators. This is what was referred to above as articulating theoretical terms to use as a cognitive base for explaining the technology of fundamental measurement.
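For readers who have not seen the product stated formally, it can be written down in one line. In the dichotomous case the Rasch model specifies

    \log \frac{P\{x_{vi} = 1\}}{P\{x_{vi} = 0\}} = \beta_v - \delta_i

where \beta_v is the measure of person v and \delta_i is the calibration of item i. Because the two parameters enter additively on the log-odds scale, the person's raw score \sum_i x_{vi} is a sufficient statistic for \beta_v and the item score \sum_v x_{vi} is sufficient for \delta_i; this is what allows items to be calibrated without reference to the particular persons measured, and persons to be measured without reference to the particular items used.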

The cognitive base of fundamental measurement extends far beyond the confines of education, through all of science, into philosophy and into the structure of measurement as it has been realized over the course of the history of science. This is why the term "fundamental" and not "Rasch" is placed before the word "measurement" as frequently as possible in this piece. The dissemination of fundamental measurement - which I am referring to as the articulation of its cognitive base - requires that we make it as accessible as possible. If our work is to receive the application it deserves, we must deal with the broad issues of measurement and with how they have worked in the history of science to resolve crises not unlike that of education today. All models of fundamental measurement work the way they do because they provide a context for the manifestation of nascent possibilities already present in the history of science. Rasch and Wright gave us a lot, but Rasch's name has become too parochial a focus for confrontation and too remote a landmark to guide us back to the sources of what is powerful about science.

Such comments may strike some as off base, given the name of our SIG. I do not mean that standard references, such as Rasch (1960), should be eliminated; I mean only that there is an important sense in which what Rasch articulated was not new, and that we overly limit ourselves by putting our work strictly within the confines of "the Rasch model". What Rasch and Wright have articulated that is new are specific criteria for recognizing data of a quality high enough for scientific objectivity to be achieved and maintained; but the originality of that articulation does not mean that the criteria themselves were not already at work in science. The point of my philosophical work is to show that the same criteria have in fact been articulated in a different language by many of those who have pursued philosophical and historical research into the nature of objectivity.

The following six proposals are offered:

A. Concerning our cognitive base of operations:

1 We should follow Wright's (1984, 1989) lead and use the history and theory of fundamental measurement, traced from N. R. Campbell through Thurstone to Guttman, Rasch, and Luce and Tukey, instead of "the Rasch model", as the foundation for our ideas about what measurement is and what it requires. More adventuresome souls can pursue this strategy in greater detail through my work (Fisher 1988, 1989) and David Andrich's Kuhn paper (Andrich 1987).

2 Because the power of the idea that Rasch implements has been at work in science since its birth in ancient Greece, to lump his work in with the IRT or latent trait gang is to grossly devalue the worth of his contribution, if not to miss its point entirely. In terms of the history of its development and the philosophical weight of its achievement, Rasch's work is not closely related to IRT theories, and should be sharply distinguished from them.
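The formal basis of that distinction can also be stated in one line: when a free discrimination parameter a_i is introduced, as in the two-parameter logistic model commonly taken to represent IRT,

    \log \frac{P\{x_{vi} = 1\}}{P\{x_{vi} = 0\}} = a_i(\theta_v - b_i)

the unweighted raw score is no longer a sufficient statistic for the person parameter, and the conditional separation of person and item parameters on which sample-free calibration and instrument-free measurement depend is lost.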

B. Concerning our field's awareness of what we have to offer:

3 More of our research should address the crisis of education. We need to show that the general lack of valid scientific instrumentation in the classroom, in the school districts, and in the testing agencies has played an important role in the development of the crisis. How and why did this lack occur? How and why does it continue to go unnoticed? How can fundamental measurement help alleviate the crisis? What fields outside of education need the techniques of fundamental measurement in order to address the social, political and economic inequalities that contribute to the educational crisis? How can these other fields use these techniques? How will education improve because of their application?

4 Most of the published work involving fundamental measurement is theoretical. Although there is a great deal of successful practice, it goes unnoted in the journals. Even though practitioners become overwhelmed with important work precisely because their measurement efforts succeed, they must do more to provide practical examples, experimental designs, data analyses and interpretations that employ fundamental measurement, so that others can learn about the tools they too need to accomplish their goals.

C. Concerning the further clarification of what we have to offer:

5 We need to make measurement easier to understand and to do. That means simple, user-friendly, and inexpensive computer programs. These programs should neither begin nor end with data analysis. Users need help in designing tests and questionnaires that will produce high quality data, and they need help in interpreting the data once it is analyzed. Though Wright and Stone (1979), Wright and Masters (1982), MSCALE (Wright, Congdon and Rossner 1988), and DICOT (Masters 1984) took us a long way in the right direction, we still need simpler packages accessible to teachers, undergraduates and computer novices (a minimal computational sketch of the kind of routine such a package would wrap appears after these proposals).

6 None of the above should be taken to mean that all the important technical and theoretical work on measurement has been done. We still face difficult problems in need of solution: the use of full-information error estimates (Adams 1988), the simultaneous estimation of differing unidimensional continua in the same data (Linacre 1989), the ongoing problems with fit statistics, and the continuing efforts to improve the efficiency of the estimation algorithms.
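To make point 5 a little more concrete, here is a minimal sketch of the kind of routine a user-friendly package would wrap behind a simple interface: an unconditional (joint) maximum-likelihood calibration of the dichotomous Rasch model, together with the unweighted (outfit) mean-square fit statistic touched on in point 6. It is written in Python with NumPy purely as an illustration; the function names are my own, it is not the algorithm used by MSCALE, DICOT or FACETS, and it omits the bias correction, convergence safeguards and treatment of perfect scores that a production program requires.

    import numpy as np

    def rasch_jmle(data, max_iter=100, tol=1e-4):
        """Joint (unconditional) maximum-likelihood calibration of the
        dichotomous Rasch model.  `data` is a persons-by-items matrix of
        0s and 1s with no zero or perfect person or item scores.
        Returns person measures and item calibrations in logits, with
        the item calibrations centered at zero."""
        data = np.asarray(data, dtype=float)
        n_persons, n_items = data.shape
        r = data.sum(axis=1)                      # person raw scores
        s = data.sum(axis=0)                      # item scores
        beta = np.log(r / (n_items - r))          # log-odds starting values
        delta = np.log((n_persons - s) / s)
        for _ in range(max_iter):
            p = 1.0 / (1.0 + np.exp(-(beta[:, None] - delta[None, :])))
            info_person = (p * (1 - p)).sum(axis=1)
            info_item = (p * (1 - p)).sum(axis=0)
            beta_new = beta + (r - p.sum(axis=1)) / info_person    # Newton step for persons
            delta_new = delta - (s - p.sum(axis=0)) / info_item    # Newton step for items
            delta_new -= delta_new.mean()         # fix the origin of the scale
            change = max(np.abs(beta_new - beta).max(),
                         np.abs(delta_new - delta).max())
            beta, delta = beta_new, delta_new
            if change < tol:
                break
        return beta, delta

    def outfit_mnsq(data, beta, delta):
        """Unweighted (outfit) mean-square fit statistic for each item:
        the average squared standardized residual across persons."""
        data = np.asarray(data, dtype=float)
        p = 1.0 / (1.0 + np.exp(-(beta[:, None] - delta[None, :])))
        z_squared = (data - p) ** 2 / (p * (1 - p))
        return z_squared.mean(axis=0)

A teacher or student should be able to run something like this on a small classroom data set and read the results directly in logits; that is the level of accessibility point 5 is asking for.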

These suggestions are made on the basis of indications from the history of science as to how things are likely to work out in the course of the emergence of a technological innovation. Each of the six points is already being pursued to some extent. Practical applications have been underway for some time. A crisis has been caused by an absence of useful measurement information. What remains is for fundamental measurement to be related to the crisis through substantive evidence, and for the relationship to be explained in the objective terms that have proven most productive in the history of science. These are not small tasks, but addressing them directly, as part of a larger strategy, will increase the chance that the methodological problems of education will be dealt with effectively sooner rather than later.

William P. Fisher, Jr.

Adams, Raymond. 1988. Full information standard errors for Rasch measurement. Paper presented to the Midwest Objective Measurement Seminar, University of Chicago, December 2.

Andrich, David. 1987. Education and other social science measurement: A Kuhnian revolution in progress. Unpublished manuscript.

Fisher, William P. 1988. Truth, Method and Measurement: The Hermeneutic of Instrumentation and the Rasch Model. Unpublished dissertation, University of Chicago, Department of Education.

Fisher, William P. 1989. Objectivity in measurement: The historical source of Rasch separability. Revised version of a paper presented at the Fifth International Objective Measurement Workshop, University of California, Berkeley, March 26.

Gritzer, Glenn and Arnold Arluke. 1985. The Making of Rehabilitation: A Political Economy of Medical Specialization, 1890-1980. Berkeley: University of California Press.

Heelan, Patrick. 1983. Natural science as a hermeneutic of instrumentation. Philosophy of Science 50(June): 181-204.

Linacre, John M. 1989. FACETS: Many-facet Rasch analysis by computer. Chicago: MESA Press.

Masters, Geoffrey. 1984. DICOT: Analyzing classroom tests with the Rasch model. Educational and Psychological Measurement 44: 145-150.

Price, Derek de Solla. 1986. Of sealing wax and string: A philosophy of the experimenter's craft and its role in the genesis of high technology. In Little Science, Big Science ... and Beyond by D. S. Price. New York: Columbia University Press.

Rasch, Georg. 1960. Probabilistic Models for Some Intelligence and Attainment Tests. Copenhagen: Danmarks Paedagogiske Institut; reprint, with Foreword and Afterword by Benjamin D. Wright, Chicago: University of Chicago Press, 1980.

Wright, Benjamin D. 1984. Despair and hope for educational measurement. Contemporary Education Review 3(1): 281-288.

Wright, Benjamin D. 1989. Deducing the Rasch model. Paper presented at the Fifth International Objective Measurement Workshop, University of California, Berkeley.

Wright, Benjamin D. and Geoffrey Masters. 1982. Rating Scale Analysis. Chicago: MESA Press.

Wright, Benjamin D., Richard Congdon and Marc Rossner. 1988. MSCALE: A Rasch program for ordered categories. Chicago: MESA Press.

Wright, Benjamin D. and Mark Stone. 1979. Best Test Design. Chicago: MESA Press.


What we have to offer. Fisher WP Jr. … Rasch Measurement Transactions, 1989, 3:3 p.72


