HOW THE ORDER OF THE ITEMS IN A BOOKLET AFFECTS ITEM FUNCTIONING: EMPIRICAL FINDINGS FROM COURSE LEVEL DATA

Akif Avcu, Emine Burcu Tunç, Müge Uluman

Abstract


In many test applications, from large scale to small scale, items may be administered in different booklets with different orderings. Changing the position of an item can cause it to function differently. The item order effect is defined as the interaction between a test taker's response to an item and the position of that item in the test. Earlier studies show that the item order effect should not be ignored. This study examined whether the items of a graduate-level Measurement and Evaluation test showed differential item functioning (DIF) across different booklets. The booklet effect was investigated with a series of DIF detection methods, and the analyses were carried out in R. The data were collected with a 25-item multiple-choice test covering the curriculum of the undergraduate-level Measurement and Evaluation course. In one booklet (Booklet A) the items followed the sequence of the curriculum, whereas in the second booklet (Booklet B) they did not. The results showed that the order of the items had an effect on DIF for some items. It was concluded that item location may affect the probability that respondents answer items correctly.
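The abstract does not name the specific detection procedures, but the references point to the difR package (Magis et al., 2010) for R. As a minimal sketch of how such a booklet DIF screen could be run, assuming a 0/1 item response matrix and a booklet indicator, the lines below apply two common methods from that package; the object names responses and booklet are hypothetical placeholders, not the authors' actual data.

library(difR)

# 'responses': 0/1 matrix, one row per examinee, 25 columns for the test items
# 'booklet':   vector giving each examinee's form, "A" or "B"
# (both object names are hypothetical placeholders)

# Mantel-Haenszel DIF test (Mantel & Haenszel, 1959), Booklet B as the focal group
mh <- difMH(Data = responses, group = booklet, focal.name = "B")
print(mh)  # lists items flagged as DIF at the default alpha = 0.05

# Logistic regression DIF (Zumbo, 1999), which also detects non-uniform DIF
lr <- difLogistic(Data = responses, group = booklet, focal.name = "B")
print(lr)

Items flagged by both procedures would be the strongest candidates for an item order (booklet) effect of the kind the abstract describes.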

 


Keywords


test booklet effect, differential item functioning


References


Abedi, J., Leon, S., & Kao, J. (2007). Examining differential item functioning in reading assessments for students with disabilities. Minneapolis, MN: University of Minnesota, Partnership for Accessible Reading Assessment.

Asil, M. (2010). Uluslararası Öğrenci Değerlendirme Programı (PISA) 2006 öğrenci anketinin kültürler arası eşdeğerliğinin incelenmesi [Examining the cross-cultural equivalence of the Programme for International Student Assessment (PISA) 2006 student questionnaire]. Doctoral dissertation, Hacettepe Üniversitesi, Ankara.

Bahar, M., Nartgün, Z., Durmuş, S., & Bıçak, B. (2010). Geleneksel-tamamlayıcı ölçme ve değerlendirme teknikleri öğretmen el kitabı [Traditional and complementary measurement and evaluation techniques: A teacher's handbook]. Ankara: Pegem Akademi.

Camilli, G., & Shepard, L. A. (1994). Methods for identifying biased test items (Vol. 4). California: SAGE Publications, Inc.

Erdem, B. (2015). Ortaöğretime geçişte kullanılan ortak sınavların değişen madde fonksiyonu açısından kitapçık türlerine göre farklı yöntemlerle incelenmesi [Examining the common exams used in transition to secondary education for differential item functioning across booklet types with different methods]. Master's thesis, Hacettepe Üniversitesi, Ankara.

Feng, Y. (2008). Difference in gender differential item functioning patterns across item format and subject area on diploma examinations after change in administration procedure. Unpublished doctoral dissertation, University of Alberta, Edmonton, Alberta, Canada.

Freedle, R., & Kostin, I. (1991). The prediction of SAT reading comprehension item difficulty for expository prose passages (ETS Research Report RR-91-29). Princeton, NJ: Educational Testing Service.

Gelin, M. N., & Zumbo, B. D. (2003). DIF results may change depending on how an item is scored: An illustration with the Center for Epidemiological Studies Depression (CES-D) scale. Educational and Psychological Measurement, 63, 65.

Gierl, M. J., & Khaliq, S. N. (2001). Identifying sources of differential item and bundle functioning on translated achievement tests: A confirmatory analysis. Journal of Educational Measurement, 38(2), 164-187.

Güler, N. (2011). Eğitimde ölçme ve değerlendirme [Measurement and evaluation in education]. Ankara: Pegem Akademi.

Hambleton, R. K. (1968). The effects of item order and anxiety on test performance and stress. Paper presented at the meeting of the American Educational Research Association, Chicago, Illinois.

Henderson, D. L. (1999). Investigation of differential item functioning in exit examinations across item format and subject area. Unpublished doctoral dissertation, University of Alberta, Edmonton, Alberta, Canada.

Henderson, D. L. (2001). Prevalence of gender DIF in mixed format high school exit examinations. Paper presented at the annual meeting of the American Educational Research Association, Seattle.

Karami, H., & Nodoushan, M. A. S. (2011). Differential item functioning (DIF): Current problems and future directions. International Journal of Language Studies, 5(4), 133-142.

Karasar, N. (2005). Bilimsel araştırma yöntemi [Scientific research method]. Ankara: Nobel.

Kingston, N. M., & Dorans, N. J. (1984). Item location effects and their implications for IRT equating and adaptive testing. Applied Psychological Measurement, 8, 147-154.

Liu, L. O., & Wilson, M. (2009). Gender differences in large-scale math assessments: PISA trend 2000 and 2003. Applied Measurement in Education, 22(2), 164-184.

Lord, F. M. (1980). Applications of item response theory to practical testing problems (pp. 181-223). Hillsdale, NJ: Erlbaum.

Magis, D., Béland, S., Tuerlinckx, F., & De Boeck, P. (2010). A general framework and an R package for the detection of dichotomous differential item functioning. Behavior Research Methods, 42, 847-862.

Mantel, N., & Haenszel, W. (1959). Statistical aspects of the analysis of data from retrospective studies of disease. Journal of the National Cancer Institute, 22, 719-748.

Mendes-Barnett, S., & Ercikan, K. (2006). Examining sources of gender DIF in mathematics assessments using a confirmatory multidimensional model approach. Applied Measurement in Education, 19, 289-304.

Newman, D., Kundert, D. K., Lane, D. S., & Bull, S. (1988). Effect of varying item order on multiple-choice test scores: Importance of statistical and cognitive difficulty. Applied Measurement in Education, 1(1), 89-97.

Ong, Y. M., Williams, J. S., & Lamprianou, I. (2011). Exploration of the validity of gender differences in mathematics assessment using differential bundle functioning. International Journal of Testing, 11, 271-293.

Özçelik, D. A. (2010b). Test hazırlama kılavuzu [Test preparation guide]. Ankara: Pegem Akademi.

Qian, J. (2014). An investigation of position effects in large-scale writing assessments. Applied Psychological Measurement, 38(7), 518-534.

R Core Team (2017). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.

Rizopoulos, D. (2006). ltm: An R package for Latent Variable Modelling and Item Response Theory Analyses, Journal of Statistical Software, 17 (5), 1-25. URL http://www.jstatsoft.org/v17/i05/

Roever, C. (2005). That's not fair! Fairness, bias, and differential item functioning in language testing. SLS Brownbag, 9(15), 1-14.

Schmitt, A. P., & Dorans, N. J. (1988). Differential item functioning for minority examinees on the SAT. Princeton, NJ: Educational Testing Service.

Tekin, H. (1993). Eğitimde ölçme ve değerlendirme [Measurement and evaluation in education]. Ankara: Yargı Kitap ve Yayınevi.

Tippets, E., & Benson, J. (1989). The effect of item arrangement on test anxiety. Applied Measurement in Education, 2(4), 289-296.

Tunç, E. B. (2016). İki ve çok kategorili puanlanan maddelerde değişen madde fonksiyonlarının karşılaştırılması [A comparison of differential item functioning in dichotomously and polytomously scored items]. Doctoral dissertation, Ankara Üniversitesi, Ankara.

Yakar, L., & Yavuz, S. (2014). Soru kitapçıklarına göre değişen madde fonksiyonları [Differential item functioning across test booklets]. Paper presented at the IV. National Congress on Measurement and Evaluation in Education and Psychology, Ankara, 9-13 June 2014.

Yergin, F. (2007). Çoktan seçmeli testlerde madde sıralamanın madde ve test istatistikleri üzerine etkisi [The effect of item ordering on item and test statistics in multiple-choice tests]. Master's thesis, Mersin Üniversitesi, Sosyal Bilimler Enstitüsü.

Yılmaz, H. B. (2014). Madde ve seçenek sıralamasının değişen madde fonksiyonuna (DMF) etkileri [The effects of item and option ordering on differential item functioning (DIF)]. Paper presented at the IV. National Congress on Measurement and Evaluation in Education and Psychology, Ankara, 9-13 June 2014.

Zenisky, A. L., Hambleton, R. K., & Robin, F. (2003). DIF detection and interpretation in large-scale science assessments: Informing item writing practices (Center for Educational Assessment MCAS Validity Report No. 1; CEA-429).

Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.




DOI: http://dx.doi.org/10.46827/ejes.v0i0.1502


