
Dimensionality analysis of students’ performance in 2013 BGCSE agricultural examination: Implications for differential item functioning

Author Affiliations

  • ¹Public Health Department, Kanye Seventh Day Adventist College of Nursing, Kanye, Botswana
  • ²Educational Foundations Department, University of Botswana, Gaborone, Botswana

Res. J. Educational Sci., Volume 5, Issue 1, Pages 1-10, January (2017)

Abstract

Assessing desirable changes in learners' behaviour is meaningful only if the measures in use provide valid outcome data for different subgroups. The deterioration of student performance in the Botswana General Certificate of Secondary Education (BGCSE) examination results is a disturbing trend that troubles parents, teachers, policy makers and government. This problem prompted the present study on dimensionality analysis of students' performance in the 2013 BGCSE Agriculture Examination and its implications for differential item functioning (DIF), in order to determine the examination's fairness to all learners. The population for the study comprised the responses of all 12,784 students who sat for the 2013 BGCSE Agriculture Examination. DIF analysis for each item was carried out by gender and by location type, using a t-test on the difference between subgroup item difficulty estimates (in logits) at the p < .05 level of significance. The gender-based DIF findings indicated that twenty-nine (29) of the 40 items exhibited DIF: seventeen (17) items favoured boys, whereas twelve (12) favoured girls. The location-based DIF findings indicated that eighteen (18) items exhibited DIF: ten (10) favoured rural students and eight (8) favoured urban students. In conclusion, the results of this study of the national assessment tool showed that the 2013 BGCSE Agriculture Examination was not fair to all students. It was therefore recommended that test developers and examination bodies improve the quality of their test items by conducting IRT psychometric analyses, including validation for DIF.
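To make the procedure described in the abstract concrete, the sketch below illustrates the separate-calibration t-test commonly used in Rasch-based DIF analysis: each item's difficulty is estimated in logits for two subgroups (e.g. boys vs. girls, or rural vs. urban), and the difference between the two estimates is tested for significance at p < .05. This is a minimal illustrative sketch, not the authors' actual analysis code; the item names, logit values, and standard errors are hypothetical placeholders.

```python
import math

def dif_t(b1: float, se1: float, b2: float, se2: float) -> float:
    """t-statistic for the difference between two item difficulty
    estimates (in logits), each with its own standard error."""
    return (b1 - b2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Hypothetical output of two separate Rasch calibrations:
# (item, difficulty and SE for group 1, difficulty and SE for group 2)
items = [
    ("Q1", 0.42, 0.05, 0.15, 0.06),
    ("Q2", -0.80, 0.07, -0.78, 0.07),
    ("Q3", 1.10, 0.06, 1.45, 0.05),
]

CRITICAL = 1.96  # two-tailed critical value for p < .05 with large samples

for name, b1, se1, b2, se2 in items:
    t = dif_t(b1, se1, b2, se2)
    if abs(t) > CRITICAL:
        # A positive t means the item is harder for group 1,
        # i.e. it favours group 2, and vice versa.
        favoured = "group 2" if t > 0 else "group 1"
        print(f"{name}: t = {t:+.2f} -> DIF, favours {favoured}")
    else:
        print(f"{name}: t = {t:+.2f} -> no significant DIF")
```

In this framing, an item is flagged as DIF when students of equal ability but different group membership have unequal chances of answering it correctly, which is what the significant logit difference indicates.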
