Keywords
conceptual error, attributes, junior high school mathematics national examination
Document Type
Article
Abstract
The goal of the research is to gain insights into the characteristics of the items in the mathematics national examination, the attributes on which the items were formulated, and the result of a conceptual error diagnosis of the mathematics materials based on the results of the junior high school mathematics national examination. This is a quantitative descriptive study. The data were collected from 3,079 grade-nine students of junior high schools who took the National Examination in the academic year of 2015/2016. The sample, consisting of 574 examinees, was established randomly based on the examination package code P0C5520. The documentation method was used to collect the data. The results show that, based on classical test theory, 16 items fall into the 'difficult' category, 24 into the 'intermediate' category, and none into the 'easy' category. Furthermore, based on item response theory, 28 items fall into the 'good' category and 12 items into the 'poor' category. In addition, there are 50 attributes on which the Junior High School Mathematics National Examination test (package P0C520) is formulated: four are content attributes and the remaining 46 are process-skill attributes. The result of the diagnosis shows that the students made 11 types of errors in completing the items. Most of the errors are conceptual errors related to the geometry material, especially in the sub-materials of polyhedra, triangles, and quadrilaterals.
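As an illustration of the classical test theory analysis referred to in the abstract, the sketch below shows how an item difficulty index (the proportion of correct responses) can be computed and mapped to the 'difficult', 'intermediate', and 'easy' categories. This is a minimal sketch only: the 0.3 and 0.7 cut-offs and the sample responses are assumptions for illustration, not values reported in the article.

```python
# Minimal sketch (assumed, not the article's code) of classical test theory
# item difficulty: the difficulty index p is the proportion of examinees who
# answer an item correctly. The 0.3 / 0.7 thresholds are a common convention
# and an assumption here; the article does not state its exact cut-offs.

def difficulty_index(item_responses):
    """Proportion of correct responses (1 = correct, 0 = incorrect)."""
    return sum(item_responses) / len(item_responses)

def categorize(p, low=0.3, high=0.7):
    """Map a difficulty index to the categories used in the abstract."""
    if p < low:
        return "difficult"
    elif p <= high:
        return "intermediate"
    return "easy"

# Hypothetical scored responses for one item across ten examinees
example_item = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]
p = difficulty_index(example_item)
print(p, categorize(p))  # 0.3 intermediate
```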
Page Range
163-173
Issue
2
Volume
3
Digital Object Identifier (DOI)
10.21831/reid.v3i2.18120
Source
https://journal.uny.ac.id/index.php/reid/article/view/18120
Recommended Citation
Kartianom, K., & Mardapi, D. (2017). The utilization of junior high school mathematics national examination data: A conceptual error diagnosis. REID (Research and Evaluation in Education), 3(2), 163-173. https://doi.org/10.21831/reid.v3i2.18120
References
Abadyo, A., & Bastari, B. (2015). Estimation of ability and item parameters in mathematics testing by using the combination of 3PLM/GRM and MCM/GPCM scoring model. REiD (Research and Evaluation in Education), 1(1), 55-72.
Gierl, M. J., Cui, Y., & Zhou, J. (2009). Reliability and attribute-based scoring in cognitive diagnostic assessment. Journal of Educational Measurement, 46(3), 293-313. https://doi.org/10.1111/j.1745-3984.2009.00082.x
Gierl, M. J., Zheng, Y., & Cui, Y. (2008). Using the attribute hierarchy method to identify and interpret cognitive skills that produce group differences. Journal of Educational Measurement, 45(1), 65-89. Retrieved from https://pdfs.semanticscholar.org/0a0b/180342ee51f6121dd4e3199c9cc4df3bc377.pdf
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. New Delhi: Sage Publications.
Isgiyanto, A. (2011). Diagnosis kesalahan siswa berbasis penskoran politomus model partial credit pada matematika. Jurnal Penelitian Dan Evaluasi Pendidikan, 15(2), 308-325. Retrieved from https://journal.uny.ac.id/index.php/jpep/article/view/1099/1151
Kartianom, K. (2017). Diagnosis kesalahan konsep materi matematika SMP berdasarkan hasil ujian nasional di kota Baubau (Master's thesis). Universitas Negeri Yogyakarta, Indonesia.
Kartianom, K., & Ndayizeye, O. (2017). What's wrong with the Asian and African students' mathematics learning achievement? The multilevel PISA 2015 data analysis for Indonesia, Japan, and Algeria. Jurnal Riset Pendidikan Matematika, 4(2), 200-210. https://doi.org/10.21831/jrpm.v4i2.16931
Leighton, J. P., & Gierl, M. J. (2007). Defining and evaluating models of cognition used in educational measurement to make inferences about examinees' thinking processes. Educational Measurement: Issues and Practice, 26(2), 3-16. https://doi.org/10.1111/j.1745-3992.2007.00090.x
Mardapi, D. (2012). Pengukuran, penilaian, dan evaluasi pendidikan. Yogyakarta: Nuha Medika.
Ministry of Education and Culture. (2015). Laporan hasil ujian nasional. Jakarta: Balitbang.
Ravand, H., & Robitzsch, A. (2015). Cognitive diagnostic modeling using R. Practical Assessment, Research & Evaluation, 20(11). Retrieved from http://pareonline.net/getvn.asp?v=20&n=11
Retnawati, H. (2014). Teori respons butir dan penerapannya: Untuk peneliti, praktisi pengukuran dan pengujian, mahasiswa pascasarjana. Yogyakarta: Nuha Medika.
Retnawati, H. (2017). Diagnosing the junior high school students' difficulties in learning mathematics. International Journal on New Trends in Education and Their Implications, 8(1), 33-50. Retrieved from http://www.ijonte.org/FileUpload/ks63207/File/04.heri_retnawati.pdf
Retnawati, H., Munadi, S., & Al-Zuhdy, Y. A. (2015). Factor analysis to identify the dimension of Test of English Proficiency (TOEP) in the listening section. REiD (Research and Evaluation in Education), 1(1), 45-54. https://doi.org/10.21831/reid.v1i1.4897
Russell, M., O'Dwyer, L. M., & Miranda, H. (2009). Diagnosing students' misconceptions in algebra: Results from an experimental pilot study. Behavior Research Methods, 41(2), 414-424. https://doi.org/10.3758/BRM.41.2.414
Sumintono, B., & Widhiarso, W. (2015). Aplikasi pemodelan Rasch pada asesmen pendidikan. Bandung: Trim Komunikata.
Tatsuoka, K. K. (2009). Cognitive assessment: An introduction to the rule space method. New York, NY: Routledge/Taylor & Francis.
Wang, C., & Gierl, M. J. (2011). Using the attribute hierarchy method to make diagnostic inferences about examinees' cognitive skills in critical reading. Journal of Educational Measurement, 48(2), 165-187. https://doi.org/10.1111/j.1745-3984.2011.00142.x
Yamtinah, S., & Budiyono, B. (2015). Pengembangan instrumen diagnosis kesulitan belajar pada pembelajaran kimia di SMA. Jurnal Penelitian Dan Evaluasi Pendidikan, 19(1), 69-81. https://doi.org/10.21831/pep.v19i1.4557