Keywords
prospective science teacher, rubric, science experiment design skills
Document Type
Article
Abstract
The Indonesian government mandates that science teachers be competent in designing science experiments for learning purposes, so that students can learn science content optimally while developing the skills they need for the 21st century. This developmental study aimed to produce a measurement instrument for the science experiment design skills of prospective science teachers that meets sound psychometric standards. The rubric was developed following Churches' four-stage method (define, design, do, and debrief), involving 10 experts (lecturers and teachers) and 124 prospective science teachers as research participants. Exploratory and confirmatory factor analyses showed that the resulting analytical rubric, which measures ten aspects (title, research objectives, relevant theories, variables, materials, equipment and instrumentation, method, an appropriate amount of data, references, and systematic and technical writing), has content validity (CVI = .96), construct validity (GFI = .94; RMSEA = .071; NFI = .99; CFI = 1.00; PNFI = .91), and reliability (α = .968). Using a standardized rubric allows the assessment to yield consistent, accurate, and objective results and helps students understand which competencies they must achieve.
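The content validity and reliability indices reported in the abstract follow standard definitions: Lawshe's content validity ratio per item is CVR = (n_e - N/2) / (N/2), the CVI is the mean CVR across items, and Cronbach's alpha is k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch of both computations, using illustrative ratings (not the study's data):

```python
from statistics import variance

def content_validity_index(essential_counts, n_experts):
    """Mean Lawshe CVR across items.

    essential_counts: for each rubric aspect, how many of the
    n_experts rated it "essential"; CVR_i = (n_e - N/2) / (N/2).
    """
    cvrs = [(n_e - n_experts / 2) / (n_experts / 2) for n_e in essential_counts]
    return sum(cvrs) / len(cvrs)

def cronbach_alpha(scores):
    """Cronbach's alpha from a respondents-by-items score matrix."""
    k = len(scores[0])                                    # number of items
    item_vars = sum(variance(col) for col in zip(*scores))
    total_var = variance([sum(row) for row in scores])    # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Illustrative: 10 experts rating 10 rubric aspects; if 8 aspects get
# 10/10 "essential" votes and 2 get 9/10, the CVI works out to .96.
cvi = content_validity_index([10, 10, 9, 10, 10, 9, 10, 10, 10, 10], 10)
print(round(cvi, 2))  # 0.96
```

The construct-validity indices (GFI, RMSEA, NFI, CFI, PNFI) come from confirmatory factor analysis and would typically be obtained from a dedicated SEM package rather than computed by hand.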
First Page
32
Last Page
46
Issue
1
Volume
10
Digital Object Identifier (DOI)
10.21831/jipi.v10i1.65853
DOI Link
http://doi.org/10.21831/jipi.v10i1.65853
Recommended Citation
Amelia, R. N., Listiaji, P., Dewi, N. R., Heriyanti, A. P., Shoba, T. M., & Sajidi, I. (2024). Developing and Validating a Rubric for Measuring Skills in Designing Science Experiments for Prospective Science Teachers. Jurnal Inovasi Pendidikan IPA, 10(1), 32-46. https://doi.org/10.21831/jipi.v10i1.65853