ASSESSMENT OF THE PSYCHOMETRIC PROPERTIES OF ATTITUDE TOWARDS ASSESSMENT TEST (ATAT)

A. M. Megbele, J. N. Odili, P. U. Osadebe

Abstract


This study assessed the psychometric properties of the Attitude Towards Assessment Test (ATAT). Four research questions guided the study. An instrumentation research design was adopted. The population consisted of secondary school students in Delta State, Nigeria. The sample comprised 1,000 students, selected through simple random and non-probability cluster sampling techniques. The test under assessment was developed by Megbele et al. (2023). The Rasch Rating Scale Model was used to answer research question 1, which assessed person and item reliability, item statistics, and the ordering of response categories. Two fit indices, the infit and outfit mean square (MNSQ) statistics, were used to answer research question 2. The data for research questions 1 and 2 were analysed with the jMetrik IRT software. Categorical Confirmatory Factor Analysis (CCFA) was used to answer research question 3, on construct validity and evidence of unidimensionality, while the Chi-Square Goodness-of-Fit statistic was used to answer research question 4, on local independence. The findings revealed that the three components of the scale had high item separation indices and reliabilities, as well as person separation indices and reliabilities within an acceptable range; that the difficulty indices of the items in the test were within an acceptable range; and that each component of the ATAT (cognitive, affective, and behavioural) measured a single construct, which is evidence of unidimensionality for each component. Based on these findings, it was concluded that all items in the components that make up the test are reliable and have adequate item difficulty, infit, and outfit MNSQ estimates, with the unidimensionality and local independence assumptions satisfied.
The study recommended that examination bodies use the test for the assessment of students in the affective domain.
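The infit and outfit MNSQ statistics mentioned above can be illustrated with a minimal sketch. For simplicity the sketch assumes a dichotomous Rasch model with known person and item parameters; the ATAT itself is polytomous and was analysed with the Rating Scale Model in jMetrik, so the function name and inputs here are purely illustrative, not the study's actual procedure:

```python
import numpy as np

def rasch_fit_statistics(X, theta, b):
    """Item infit and outfit MNSQ under a dichotomous Rasch model.

    X:     persons x items matrix of 0/1 responses
    theta: person ability estimates (logits)
    b:     item difficulty estimates (logits)
    """
    # Model-expected probability of endorsing each item
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1.0 - P)                  # response variance under the model
    Z2 = (X - P) ** 2 / W              # squared standardized residuals
    outfit = Z2.mean(axis=0)           # unweighted mean square, per item
    # Infit weights each residual by its information, so it is less
    # sensitive to unexpected responses from poorly targeted persons
    infit = ((X - P) ** 2).sum(axis=0) / W.sum(axis=0)
    return infit, outfit
```

When responses are generated by the model itself, both statistics centre near 1.0; values far above 1 flag underfit (noisy items) and values far below 1 flag overfit (redundant items), which is the usual reading of the acceptable-range criteria the study applied.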

 


Keywords


psychometric properties; attitude towards assessment test; categorical confirmatory factor analysis; unidimensionality


References


Ceniza, J. C., & Cereno, D. C. (2012). Development of mathematics diagnostic test for DORSHS. Available from http://www.doscst.edu.ph/index.php/academics/graduateschool/publication/category/5-volum-1-issue-1-2012

Cordier, R., Munro, N., Wilkes-Gillan, S., Speyer, R., Parsons, L., & Joosten, A. (2019). Applying Item Response Theory (IRT) modeling to an observational measure of childhood pragmatics: The Pragmatics Observational Measure-2. Frontiers in Psychology, 10, 408. doi:10.3389/fpsyg.2019.00408

Field, A. P. (2005). Discovering statistics using SPSS. Sage Publications.

Fishbein, M., & Ajzen, I. (2010). Predicting and changing behavior. New York: Taylor & Francis.

Green, R. (2013). Statistical analysis for language testers. New York: Palgrave Macmillan.

Maydeu-Olivares, A., Cai, L., & Hernández, A. (2011). Comparing the fit of item response theory and factor analysis models. Structural Equation Modeling: A Multidisciplinary Journal, 18(3), 333-356. doi:10.1080/10705511.2011.581993

Megbele, A. M., Odili, J. N., & Osadebe, P. U. (2023). Development of Attitude Towards Assessment Test for Secondary School Students in Delta State. Canadian Journal of Educational and Social Studies, 3(4), 120-132.

Muis, K. R., Winne, P. H., & Edwards, O. V. (2009). Modern psychometrics for assessing achievement goal orientation: A Rasch analysis. British Journal of Educational Psychology, 79(3), 547-576. doi:10.1348/000709908X383472

Penfield, R. D. (2014). An NCME instructional module on polytomous item response theory models. Educational Measurement: Issues and Practice, 33(1), 36-48. doi:10.1111/emip.12023

Sharkness, J., & DeAngelo, L. (2011). Measuring students' involvement: A comparison of classical test theory and item response theory in the construction of scales from student surveys. Research in Higher Education, 52(5), 480-507. doi:10.1007/s11162-010-9202-3

Ubi, I. O., Umoinyang, I. E., & Joshua, M. T. (2011). Item local independence in selection examination in Nigeria. Education for Today: Journal of Faculty of Education, 7(1), 175-188.

Yambi, T. A. C. (2018). Assessment and Evaluation in Education. Available from: https://www.researchgate.net/publication/342918149_ASSESSMENT_AND_EVALUATION_IN_EDUCATION

Zheng, Y. (2016). Online calibration of polytomous items under the Generalized Partial Credit Model. Applied Psychological Measurement, 40(6), 434-450. doi:10.1177/0146621616650406




DOI: http://dx.doi.org/10.46827/ejoe.v8i3.5054



Copyright © 2016-2023. European Journal of Open Education and E-learning Studies (ISSN 2501-9120) is a registered trademark of Open Access Publishing Group. All rights reserved.

This journal is a serial publication uniquely identified by an International Standard Serial Number (ISSN), with a certificate issued by the Romanian National Library (Biblioteca Nationala a Romaniei). All research works are uniquely identified by a CrossRef DOI (digital object identifier) supplied by indexing and repository platforms.

All research works published in this journal meet the Open Access Publishing requirements and can be freely accessed, shared, modified, distributed, and used for educational, commercial, and non-commercial purposes under a Creative Commons Attribution 4.0 International License (CC BY 4.0).