Reflection of the Test-Item Quality in State SMP and SMA in Bandar Lampung

Ujang Suparman

Abstract


The objectives of this research are to analyze critically the quality of test items used in SMP and SMA (mid-semester tests, final-semester tests, and National Examination practice tests) in terms of overall reliability, level of difficulty, discriminating power, and the quality of answer keys and distractors. The test items were analyzed using item analysis (ITEMAN) and two types of descriptive statistics, one for analyzing the test items and the other for analyzing the options. The findings are very far from what is commonly believed: the quality of the majority of test items, as well as of the answer keys and distractors, is unsatisfactory. Based on the results of the analysis, conclusions are drawn and recommendations are put forward.

Key words: assessment, quality of test items, test item analysis
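The two item-level statistics named in the abstract, level of difficulty and discriminating power, can be illustrated with a short sketch. This is not the ITEMAN program itself, only a minimal classical-test-theory calculation; the examinee responses, total scores, and the 27% upper-lower split are hypothetical assumptions.

```python
def item_difficulty(scores):
    """Difficulty index p: proportion of correct responses (1 = correct, 0 = incorrect)."""
    return sum(scores) / len(scores)

def item_discrimination(item_scores, total_scores, fraction=0.27):
    """Upper-lower discrimination index D = p(upper group) - p(lower group).

    item_scores  : 0/1 responses to one item, one per examinee
    total_scores : total test scores, same examinee order
    fraction     : share of examinees in each extreme group (27% is conventional)
    """
    n = len(item_scores)
    k = max(1, round(n * fraction))
    # Rank examinees by total score, highest first
    ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    upper = [item_scores[i] for i in ranked[:k]]
    lower = [item_scores[i] for i in ranked[-k:]]
    return sum(upper) / k - sum(lower) / k

# Hypothetical responses of 10 examinees to one multiple-choice item
item = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
totals = [48, 45, 44, 40, 38, 30, 28, 25, 20, 15]
print(round(item_difficulty(item), 2))   # 0.5 (moderate difficulty)
print(round(item_discrimination(item, totals), 2))
```

Under common rules of thumb, a difficulty index between roughly 0.30 and 0.70 and a discrimination index above about 0.30 indicate a usable item; values outside those ranges flag items for revision, which is the kind of judgment the study applies to the school tests it examines.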

Full Text: PDF





This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Copyright is reserved to AKSARA: Jurnal Bahasa dan Sastra, which is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.