A Systematic Review of the Use of Technology in Educational Assessment Practices: Lessons Learned and Directions for Future Studies

(1) * Heri Retnawati (Universitas Negeri Yogyakarta, Indonesia)
(2) Elena Kardanova (HSE University, Russian Federation)
(3) Sumaryanto Sumaryanto (Universitas Negeri Yogyakarta, Indonesia)
(4) Lantip Diat Prasojo (Universitas Negeri Yogyakarta, Indonesia)
(5) Jailani Jailani (Universitas Negeri Yogyakarta, Indonesia)
(6) Elly Arliani (Universitas Negeri Yogyakarta, Indonesia)
(7) Kana Hidayati (Universitas Negeri Yogyakarta, Indonesia)
(8) Mathilda Susanti (Universitas Negeri Yogyakarta, Indonesia)
(9) Himmawati Puji Lestari (Universitas Negeri Yogyakarta, Indonesia)
(10) Ezi Apino (Universitas Negeri Yogyakarta, Indonesia)
(11) Ibnu Rafi (Universitas Negeri Yogyakarta, Indonesia)
(12) Munaya Nikma Rosyada (Universitas Negeri Yogyakarta, Indonesia)
(13) Rugaya Tuanaya (Universitas Negeri Yogyakarta, Indonesia)
(14) Septinda Rima Dewanti (1. School of Early Childhood and Inclusive Education, Queensland University of Technology, Brisbane, Australia; 2. Department of Guidance and Counseling, Universitas Negeri Yogyakarta, Sleman, Indonesia)
(15) Rimajon Sotlikova (Webster University in Tashkent, Uzbekistan)
(16) Gulzhaina Kuralbayevna Kassymova (1. Institute of Pedagogy and Psychology, Abai Kazakh National Pedagogical University, Almaty, Kazakhstan; 2. Educational Leadership Program, Queen Margaret University, Edinburgh, United Kingdom)
*corresponding author

Abstract


Previous studies have demonstrated that technology helps students achieve learning outcomes. However, many studies focus on only one aspect of technology's role in educational assessment, leaving a gap in research that examines how multiple factors shape the use of technology in assessment. Hence, through a systematic review, we analyzed the extent and manner in which technology is integrated into educational assessments and how education level, domain of learning, and region may affect its use. Following the PRISMA guidelines for systematic reviews, we reviewed empirical studies from two major databases (i.e., Scopus and ERIC) and a national journal whose focus and scope are educational measurement and assessment. The findings emphasize the roles of technology in educational assessment practices and how these roles are adapted to varying educational contexts, such as the level of education, the three domains of learning (i.e., cognitive, psychomotor, and affective), and the setting in which the assessment was conducted. These findings not only highlight the current roles of technology in educational assessment but also provide a roadmap for future research aimed at optimizing the integration of technology across diverse educational contexts.
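To make the screening workflow concrete, the sketch below shows one way exports from the two databases could be merged and deduplicated before PRISMA screening begins. This is a minimal illustration under stated assumptions, not the authors' actual procedure: the file names, the column labels ("doi", "title"), and the matching rule are all hypothetical.

```python
# Minimal sketch of a PRISMA identification step: merge Scopus and ERIC
# exports and drop duplicate records. File names and column labels are
# hypothetical; real database exports would need field mapping first.
import csv

def load_records(path, source):
    """Read one CSV export and tag each record with its source database."""
    with open(path, newline="", encoding="utf-8") as f:
        return [{**row, "source": source} for row in csv.DictReader(f)]

def deduplicate(records):
    """Keep the first occurrence of each record, matched by DOI,
    falling back to a normalized title when the DOI is missing."""
    seen, unique = set(), []
    for rec in records:
        key = (rec.get("doi") or rec.get("title", "")).strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

if __name__ == "__main__":
    records = load_records("scopus_export.csv", "Scopus") \
            + load_records("eric_export.csv", "ERIC")
    unique = deduplicate(records)
    # Counts feed the PRISMA flow diagram: identified vs. after deduplication.
    print(f"Records identified: {len(records)}; after deduplication: {len(unique)}")
```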

Keywords


Use of Technology; Test Administration; Giving Feedback; Test Scoring; Test Item Generation; Response Time Recording; Educational Assessment; Systematic Review

DOI

https://doi.org/10.31763/ijrcs.v4i4.1572
References


[1] U. Karimah, H. Retnawati, D. Hadiana, P. Pujiastuti, E. Yusron, “The characteristics of chemistry test items on nationally-standardized school examination in Yogyakarta City,†REID (Research and Evaluation in Education), vol. 7, no. 1, pp. 1-12, 2021, https://doi.org/10.21831/reid.v7i1.31297.
[2] I. Rafi, H. Retnawati, E. Apino, D. Hadiana, I. Lydiati, and M. N. Rosyada, “What might be frequently overlooked is actually still beneficial: Learning from post national-standardized school examination,†Pedagogical Research, vol. 8, no. 1, pp. 1-15, 2023, https://doi.org/10.29333/pr/12657.
[3] H. Retnawati, B. Kartowagiran, J. Arlinwibowo, and E. Sulistyaningsih, “Why are the mathematics national examination items difficult and what is teachers’ strategy to overcome it?,†International Journal of Instruction, vol. 10, no. 3, pp. 257-276, 2017, https://doi.org/10.12973/iji.2017.10317a.
[4] I. A. M. S. Widiastuti, “Assessment and feedback practices in the EFL classroom,†REID (Research and Evaluation in Education), vol. 7, no. 1, pp. 13-22, 2021, https://doi.org/10.21831/reid.v7i1.37741.
[5] M. Zhou, “Commentary: Significance of assessment in learning: The role of educational assessment tools,†Science Insights Education Frontiers, vol. 18, no. 2, pp. 2881-2883, 2023, https://doi.org/10.15354/sief.23.co215.
[6] R. Rukli and N. A. Atan, “Simulation of low-high method in adaptive testing,†REID (Research and Evaluation in Education), vol. 10, no. 1, pp. 35-49, 2024, https://doi.org/10.21831/reid.v10i1.66922.
[7] N. Kania, Y. S. Kusumah, J. A. Dahlan, E. Nurlaelah, F. Gürbüz, and E. Bonyah, “Constructing and providing content validity evidence through the Aiken’s V index based on the experts’ judgments of the instrument to measure mathematical problem-solving skills,†REID (Research and Evaluation in Education), vol. 10, no. 1, pp. 64-79, 2024, https://doi.org/10.21831/reid.v10i1.71032.
[8] R. Setiawan, W. Wagiran, and Y. Alsamiri, “Construction of an instrument for evaluating the teaching process in higher education: Content and construct validity,†REID (Research and Evaluation in Education), vol. 10, no. 1, pp. 50-63, 2024, https://doi.org/10.21831/reid.v10i1.63483.
[9] L. W. K. Yim, C. Y. Lye, and P. W. Koh, “A psychometric evaluation of an item bank for an English reading comprehension tool using Rasch analysis,†REID (Research and Evaluation in Education), vol. 10, no. 1, pp. 18-34, 2024, https://doi.org/10.21831/reid.v10i1.65284.
[10] N. H. Assa’diyah and S. Hadi, “Developing student character assessment questionnaire on French subject in state high schools,†REID (Research and Evaluation in Education), vol. 7, no. 2, pp. 168-176, 2021, https://doi.org/10.21831/reid.v7i2.43196.
[11] M. A. Hidayah and F. A. Setiawati, “Developing a character assessment instrument based on the school culture,†REID (Research and Evaluation in Education), vol. 8, no. 2, pp. 90-99, 2022, https://doi.org/10.21831/reid.v8i2.46802.
[12] U. Faizah, D. Zuchdi, and Y. Alsamiri, “An authentic assessment model to assess kindergarten students’ character,†REID (Research and Evaluation in Education), vol. 5, no. 2, pp. 103-119, 2019, https://doi.org/10.21831/reid.v5i2.24588.
[13] D. Hadiana, B. Hayat, and B. Tola, “Comparison of methods for detecting anomalous behavior on large-scale computer-based exams based on response time and responses,†REID (Research and Evaluation in Education), vol. 6, no. 2, pp. 87-97, 2020, https://doi.org/10.21831/reid.v6i2.31260.
[14] D. S. Ciptaningrum, N. H. P. S. Putro, N. K. Sari, and N. Hasanuddin, “Evaluation of learning process: Knowledge of ICT integration among pre-service English language teachers,†REID (Research and Evaluation in Education), vol. 7, no. 1, pp. 46-56, 2021, https://doi.org/10.21831/reid.v7i1.30521.
[15] I. Ismiyati, H. Retnawati, S. Suranto, H. Haryanto, M. Sholihah, and T. Tusyanah, “The readiness of prospective teachers based on online teaching competencies and learning activities in COVID-19 pandemic: A cluster analysis-based approach,†REID (Research and Evaluation in Education), vol. 8, no. 2, pp. 127-139, 2022, https://doi.org/10.21831/reid.v8i2.45576.
[16] B. H. See, S. Gorard, B. Lu, L. Dong, and N. Siddiqui, “Is technology always helpful?: A critical review of the impact on learning outcomes of education technology in supporting formative assessment in schools,†Research Papers in Education, vol. 37, no. 6, pp. 1064-1096, 2022, https://doi.org/10.1080/02671522.2021.1907778.
[17] K. J. Carstens, J. M. Mallon, M. Bataineh, and A. Al-Bataineh, “Effects of technology on student learning,†Turkish Online Journal of Educational Technology, vol. 20, no. 1, pp. 105–113, 2021, https://eric.ed.gov/?id=EJ1290791.
[18] H. Akram, A. H. Abdelrady, A. S. Al-Adwan, and M. Ramzan, “Teachers’ perceptions of technology integration in teaching-learning practices: A systematic review,†Frontiers in Psychology, vol. 13, pp. 1-9, 2022, https://doi.org/10.3389/fpsyg.2022.920317.
[19] J. Arlinwibowo, H. Retnawati, and B. Kartowagiran, “The impact of ICT utilization to improve the learning outcome: A meta-analysis,†International Journal of Evaluation and Research in Education (IJERE), vol. 11, no. 2, pp. 522-531, 2022, https://doi.org/10.11591/ijere.v11i2.22112.
[20] Ã. A. Jiménez Sierra, J. M. Ortega Iglesias, J. Cabero-Almenara, and A. Palacios-Rodríguez, “Development of the teacher’s technological pedagogical content knowledge (TPACK) from the Lesson Study: A systematic review,†Frontiers in Education, vol. 8, pp. 1–11, 2023, https://doi.org/10.3389/feduc.2023.1078913.
[21] Y. D. Kristanto, “Technology-enhanced pre-instructional peer assessment: Exploring students’ perceptions in a statistical methods course,†REID (Research and Evaluation in Education), vol. 4, no. 2, pp. 105-116, 2018, https://doi.org/10.21831/reid.v4i2.20951.
[22] P. Sivananda and A. A. Aziz, “Utilizing technology to promote active learning: A systematic literature review,†International Journal of Academic Research in Progressive Education and Development, vol. 10, no. 3, pp. 784-803, 2021, https://doi.org/10.6007/IJARPED/v10-i3/10815.
[23] S. Timotheou et al., “Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: A literature review,†Education and Information Technologies, vol. 28, pp. 6695-6726, 2023, https://doi.org/10.1007/s10639-022-11431-8.
[24] H. Setiawan and S. Phillipson, “The effectiveness of game-based science learning (GBSL) to improve students’ academic achievement: A meta-analysis of current research from 2010 to 2017,†REID (Research and Evaluation in Education), vol. 5, no. 2, pp. 152-168, 2019, https://doi.org/10.21831/reid.v5i2.28073.
[25] C. A. Purdescu, “The quantification of the time saved by the professors through the introduction of the electronic evaluation,†International Conference of Management and Industrial Engineering, vol. 11, pp. 175-182, 2023, https://doi.org/10.56177/11icmie2023.33.
[26] H. Retnawati, “Learning trajectory of item response theory course using multiple softwares,†Olympiads in Informatics, vol. 11, pp. 123-142, 2017, https://doi.org/10.15388/ioi.2017.10.
[27] M. M. Neumann, J. L. Anthony, N. A. Erazo, and D. L. Neumann, “Assessment and technology: Mapping future directions in the early childhood classroom,†Frontiers in Education, vol. 4, p. 116, 2019, https://doi.org/10.3389/feduc.2019.00116.
[28] E. Eliaumra, D. P. Samaela, and N. K. Muhdin, “Developing diagnostic test assessment to measure creative thinking skills of Biology preservice teacher students,†REID (Research and Evaluation in Education), vol. 8, no. 2, pp. 152-168, 2022, https://doi.org/10.21831/reid.v8i2.50885.
[29] T. Fan, J. Song, and Z. Guan, “Integrating diagnostic assessment into curriculum: A theoretical framework and teaching practices,†Language Testing in Asia, vol. 11, no. 1, pp. 1-23, 2021, https://doi.org/10.1186/s40468-020-00117-y.
[30] C. N. Blundell, “Teacher use of digital technologies for school-based assessment: A scoping review,†Assessment in Education: Principles, Policy & Practice, vol. 28, no. 3, pp. 279-300, 2021, https://doi.org/10.1080/0969594X.2021.1929828.
[31] N. H. C. Hashim and K. Osman, “Teaching and learning by using online application during movement control order,†International Journal of Academic Research in Progressive Education and Development, vol. 10, no. 2, pp. 605-614, 2021, https://doi.org/10.6007/IJARPED/v10-i2/10143.
[32] M. Muhardis, B. Tola, and H. Haribowo, “The respondent factors on the digital questionnaire responses,†REID (Research and Evaluation in Education), vol. 5, no. 2, pp. 144-151, 2019, https://doi.org/10.21831/reid.v5i2.26943.
[33] F. A. A. Maryo and E. Pujiastuti, “Gamification in EFL class using Quizizz as an assessment tool,†Proceedings of Digital Literacy in Education and Science, vol. 3, pp. 75-80, 2022, https://doi.org/10.30595/pspfs.v3i.268.
[34] T. Hussein, M. Nat, H. F. Hasan, and A. Mahdi, “Development and evaluation of an online gamified assessment environment,†Proccedings of the 4th International Conference on Communication Engineering and Computer Science, pp. 189–197, 2022, https://doi.org/10.24086/cocos2022/paper.744.
[35] P. Chakraborty, N. P. Kuruvatti and H. D. Schotten, "A novel serious game engineering based interactive visualization and evaluation platform for cellular technologies," 2017 International Symposium on Networks, Computers and Communications (ISNCC), pp. 1-6, 2017, https://doi.org/10.1109/ISNCC.2017.8072012.
[36] A. F. C. D. Carmo, M. H. Shimabukuro, and E. H. D. Alcantara, “Using visual analytics techniques to evaluate the data quality in environmental datasets,†Boletim de Ciências Geodésicas, vol. 22, no. 3, pp. 542-556, 2016, https://doi.org/10.1590/S1982-21702016000300031.
[37] P. Vittorini, S. Menini, and S. Tonelli, “An AI-based system for formative and summative assessment in data science courses,†International Journal of Artificial Intelligence in Education, vol. 31, pp. 159-185, 2021, https://doi.org/10.1007/s40593-020-00230-2.
[38] N. H. Aswanti and W. Isnaeni, “Analysis of critical thinking skills, cognitive learning outcomes, and student activities in learning the human excretory system using an interactive flipbook,†REID (Research and Evaluation in Education), vol. 9, no. 1, pp. 37-48, 2023, https://doi.org/10.21831/reid.v9i1.53126.
[39] N. R. Hoover and L. M. Abrams, “Teachers’ instructional use of summative student assessment data,†Applied Measurement in Education, vol. 26, no. 3, pp. 219-231, 2013, https://doi.org/10.1080/08957347.2013.793187.
[40] M. Ulwatunnisa, H. Retnawati, M. Muhardis, and E. Yusron, “Revealing the characteristics of Indonesian language test used in the national-standardized school examinations,†REID (Research and Evaluation in Education), vol. 9, no. 2, pp. 210-222, 2023, https://doi.org/10.21831/reid.v9i2.31999.
[41] D. Chen, A. Jeng, S. Sun, and B. Kaptur, “Use of technology-based assessments: A systematic review covering over 30 countries,†Assessment in Education: Principles, Policy & Practice, vol. 30, no. 5-6, pp. 396-428, 2023, https://doi.org/10.1080/0969594X.2023.2270181.
[42] O. T. Akintayo, C. A. Eden, O. O. Ayeni, and N. C. Onyebuchi, “Evaluating the impact of educational technology on learning outcomes in the higher education sector: A systematic review,†Open Access Research Journal of Multidisciplinary Studies, vol. 7, no. 2, pp. 52-72, 2024, https://doi.org/10.53022/oarjms.2024.7.2.0026.
[43] K. Mangaroska, R. Tahir, M. Lorås and A. Mavroudi, "What do We Know about Learner Assessment in Technology-Rich Environments? A Systematic Review of Systematic Reviews," 2018 IEEE 18th International Conference on Advanced Learning Technologies (ICALT), pp. 16-20, 2018, https://doi.org/10.1109/ICALT.2018.00010.
[44] Q. Lin, Y. Yin, X. Tang, R. Hadad, and X. Zhai, “Assessing learning in technology-rich maker activities: A systematic review of empirical research,†Computers & Education, vol. 157, p. 103944, 2020, https://doi.org/10.1016/j.compedu.2020.103944.
[45] C. Madland, V. Irvine, C. DeLuca, and O. Bulut, “Technology-integrated assessment: A literature review,†Open/Technology in Education, Society, and Scholarship Association Journal, vol. 4, no. 1, pp. 1-48, 2024, https://doi.org/10.18357/otessaj.2024.4.1.57.
[46] L. S. Uman, “Systematic reviews and meta-analyses,†Journal of the Canadian Academy of Child and Adolescent Psychiatry, vol. 20, no. 1, pp. 57-59, 2011, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3024725/.
[47] M. J. Page et al., “The PRISMA 2020 statement: An updated guideline for reporting systematic reviews,†bmj, vol. 372, pp. 1-9, 2021, https://doi.org/10.1136/bmj.n71.
[48] A. Carrera-Rivera, W. Ochoa, F. Larrinaga, and G. Lasa, “How-to conduct a systematic literature review: A quick guide for computer science research,†MethodsX, vol. 9, pp. 1-12, 2022, https://doi.org/10.1016/j.mex.2022.101895.
[49] J. Baas, M. Schotten, A. Plume, G. Côté, and R. Karimi, “Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies,†Quantitative Science Studies, vol. 1, no. 1, pp. 377-386, 2020, https://doi.org/10.1162/qss_a_00019.
[50] J. Strayer, “ERIC database alternatives and strategies for education researchers,†Reference Services Review, vol. 36, no. 1, pp. 86-96, 2008, https://doi.org/10.1108/00907320810852050.
[51] T. Wright and S. Pullen, “Examining the literature: A bibliometric study of ESD journal articles in the Education Resources Information Center database,†Journal of Education for Sustainable Development, vol. 1, no. 1, pp. 77–90, 2007, https://doi.org/10.1177/097340820700100114.
[52] T. Azungah, “Qualitative research: Deductive and inductive approaches to data analysis,†Qualitative Research Journal, vol. 18, no. 4, pp. 383-400, 2018, https://doi.org/10.1108/qrj-d-18-00035.
[53] K. Krippendorff, “Reliability in content analysis: Some common misconceptions and recommendations,†Human Communication Research, vol. 30, no. 3, pp. 411-433, 2004, https://doi.org/10.1111/j.1468-2958.2004.tb00738.x.
[54] M. L. McHugh, “Interrater reliability: The kappa statistic,†Biochemia Medica, vol. 22, no. 3, pp. 276–282, 2012, https://doi.org/10.11613/BM.2012.031.
[55] Z. Xu, B. Zhou, Z. Yang, X. Yuan, Y. Zhang, and Q. Lu, “NeatSankey: Sankey diagrams with improved readability based on node positioning and edge bundling,†Computers & Graphics, vol. 113, pp. 10-20, 2023, https://doi.org/10.1016/j.cag.2023.05.001.
[56] A. H. Al-Maqbali and A. Al-Shamsi, “Assessment strategies in online learning environments during the COVID-19 pandemic in Oman,†Journal of University Teaching and Learning Practice, vol. 20, no. 5, pp. 1-21, 2023, https://doi.org/10.53761/1.20.5.08.
[57] J. Andrews-Todd, J. Steinberg, M. Flor, and C. M. Forsyth, “Exploring automated classification approaches to advance the assessment of collaborative problem solving skills,†Journal of Intelligence, vol. 10, pp. 1-24, 2022, https://doi.org/10.3390/jintelligence10030039.
[58] T. Bleckmann and G. Friege, “Concept maps for formative assessment: Creation and implementation of an automatic and intelligent evaluation method,†Knowledge Management & E-Learning, vol. 15, no. 3, pp. 433-447, 2023, https://doi.org/10.34105/j.kmel.2023.15.025.
[59] Y.-H. Cheng, “E-portfolios in EFL writing: Benefits and challenges,†Language Education & Assessment, vol. 5, no. 1, pp. 52-70, 2022, https://doi.org/10.29140/lea.v5n1.815.
[60] S. Chinjunthuk, P. Junpeng, and K. N. Tang, “Use of digital learning platform in diagnosing seventh grade students’ mathematical ability levels,†Journal of Education and Learning, vol. 11, no. 3, pp. 95-104, 2022, https://doi.org/10.5539/jel.v11n3p95.
[61] U. Durrani, O. Hujran, and A. S. Al-Adwan, “CrossQuestion game: A group-based assessment for gamified flipped classroom experience using the ARCS model,†Contemporary Educational Technology, vol. 14, no. 2, pp. 1-15, 2022, https://doi.org/10.30935/cedtech/11568.
[62] F. Friyatmi, D. Mardapi, H. Haryanto, and E. Rahmi, “The development of computerized economics item banking for classroom and school-based assessment,†European Journal of Educational Research, vol. 9, no. 1, pp. 293-303, 2020, https://doi.org/10.12973/eu-jer.9.1.293.
[63] D. González-Gómez, J. S. Jeong, and F. Cañada-Cañada, “Examining the effect of an online formative assessment tool (OFAT) of students’ motivation and achievement for a university science education,†Journal of Baltic Science Education, vol. 19, no. 3, pp. 401-414, 2020, https://doi.org/10.33225/jbse/20.19.401.
[64] H. Marlianawati, S. Suranto, and B. Kartowagiran, “Smartphone application for assessing teacher performance,†REID (Research and Evaluation in Education), vol. 9, no. 2, pp. 118-129, 2023, https://doi.org/10.21831/reid.v9i2.52384.
[65] S. Senel and H. C. Senel, “Use of take-home exam for remote assessment: A case study from Turkey,†Journal of Educational Technology and Online Learning, vol. 4, no. 2, pp. 236-255, 2021, https://doi.org/10.31681/jetol.912965.
[66] E. J. Siwi, R. Anindyarini, and S. Nahar, “Item parameters of Yureka Education Center (YEC) English Proficiency Online Test (EPOT) instrument,†REID (Research and Evaluation in Education), vol. 6, no. 1, pp. 51-65, 2020, https://doi.org/10.21831/reid.v6i1.31013.
[67] I. Suhardi, “Alternative item selection strategies for improving test security in computerized adaptive testing of the algorithm,†REID (Research and Evaluation in Education), vol. 6, no. 1, pp. 32-40, 2020, https://doi.org/10.21831/reid.v6i1.30508.
[68] A. Tuluk and H. Yurdugul, “Design and development of a web based dynamic assessment system to increase students’ learning effectiveness,†International Journal of Assessment Tools in Education, vol. 7, no. 4, pp. 631-656, 2020, https://doi.org/10.21449/ijate.730454.
[69] D. Yao, “A comparative study of test takers’ performance on computer-based test and paper-based test across different CEFR levels,†English Language Teaching, vol. 13, no. 1, pp. 124-133, 2020, https://doi.org/10.5539/elt.v13n1p124.
[70] C. Cutting and K. Larkin, “The impact of weekly formative video feedback on pre-service teachers’ experiences in online mathematics education,†Mathematics Teacher Education and Development, vol. 23, no. 1, pp. 74-90, 2021, https://eric.ed.gov/?id=EJ1295257.
[71] G. T. L. Hoang, “Feedback precision and learners’ responses: A study into ETS ‘criterion’ automated corrective feedback in EFL writing classrooms,†JALT CALL (Japan Association for Language Teaching Computer Assisted Language Learning) Journal, vol. 18, no. 3, pp. 444–467, 2022, https://doi.org/10.29140/jaltcall.v18n3.775.
[72] J. S. Jeong, D. González-Gómez, and F. Yllana Prieto, “Sustainable and flipped STEM education: Formative assessment online interface for observing pre-service teachers’ performance and motivation,†Education Sciences, vol. 10, pp. 1-14, 2020, https://doi.org/10.3390/educsci10100283.
[73] S. Koltovskaia and S. Mahapatra, “Student engagement with computermediated teacher written corrective feedback: A case study,†JALT CALL (Japan Association for Language Teaching Computer Assisted Language Learning) Journal, vol. 18, no. 2, pp. 286-315, 2022, https://doi.org/10.29140/jaltcall.v18n2.519.
[74] S. Kusairi, “A web-based formative feedback system development by utilizing isomorphic multiple choice items to support physics teaching and learning,†Journal of Technology and Science Education, vol. 10, no. 1, pp. 117-126, 2020, https://doi.org/10.3926/jotse.781.
[75] M. Mohammadi, M. Zarrabi, and J. Kamali, “Formative assessment feedback to enhance the writing performance of Iranian IELTS candidates: Blending teacher and automated writing evaluation,†International Journal of Language Testing, vol. 13, no. 1, pp. 206-224, 2023, https://doi.org/10.22034/ijlt.2022.364072.1201.
[76] M. Olivera-Aguilar, H.-S. Lee, A. Pallant, V. Belur, M. Mulholland, O. L. Liu, “Comparing the effect of contextualized versus generic automated feedback on students’ scientific argumentation,†ETS Research Report Series, vol. 22, no. 1, pp. 1-14, 2022, https://doi.org/10.1002/ets2.12344.
[77] A. Pásztor, A. Magyar, A. Pásztor-Kovács, and A. Rausch, “Online assessment and game-based development of inductive reasoning,†Journal of Intelligence, vol. 10, no. 3, p. 59, 2022, https://doi.org/10.3390/jintelligence10030059.
[78] P. Senadheera and G. U. Kulasekara, “A formative assessment design suitable for online learning environments and its impact on students’ learning,†Open Praxis, vol. 13, no. 4, pp. 385-396, 2021, https://doi.org/10.55982/openpraxis.13.4.261.
[79] F. Toma, D. C. Diaconu, and C. M. Popescu, “The use of the Kahoot! learning platform as a type of formative assessment in the context of pre-university education during the COVID-19 pandemic period,†Education Sciences, vol. 11, no. 10, p. 649, 2021, https://doi.org/10.3390/educsci11100649.
[80] I. Weir, R. Gwynllyw, and K. Henderson, “A case study in the e-assessment of statistics for nonspecialists,†Journal of University Teaching and Learning Practice, vol. 18, no. 2, pp. 1–20, 2021, https://doi.org/10.53761/1.18.1.5.
[81] R. R. Wolf and A. B. Wolf, “Using AI to evaluate a competency-based online writing course in nursing,†Online Learning, vol. 27, no. 3, pp. 41-69, 2023, https://doi.org/10.24059/olj.v27i3.3974.
[82] L. Xian, “The effectiveness of dynamic assessment in linguistic accuracy in EFL writing: An investigation assisted by online scoring systems,†Language Teaching Research Quarterly, vol. 18, pp. 98–114, 2020, https://doi.org/10.32038/ltrq.2020.18.07.
[83] N. Yarahmadzehi and M. Goodarzi, “Investigating the role of formative mobile based assessment in vocabulary learning of pre-intermediate EFL learners in comparison with paper based assessment,†Turkish Online Journal of Distance Education, vol. 21, no. 1, pp. 181-196, 2020, https://doi.org/10.17718/tojde.690390.
[84] S. S. Yilmaz, “Investigation of traditional and web 2.0 supported (a sample of kahoot) formative assessment and evaluation in science course,†Shanlax International Journal of Education, vol. 11, no. S1, pp. 145-152, 2023, https://doi.org/10.34293/education.v11iS1-Oct.6689.
[85] A. S. Alharbi and Z. Meccawy, “Introducing socrative as a tool for formative assessment in saudi EFL classrooms,†Arab World English Journal, vol. 11, no. 3, pp. 372-384, 2020, https://doi.org/10.24093/awej/vol11no3.23.
[86] P. Auphan, J. Ecalle, and A. Magnan, “The high potential of computer-based reading assessment,†Canadian Journal of Learning and Technology, vol. 46, no. 1, pp. 1-23, 2020, https://doi.org/10.21432/cjlt27847.
[87] A. Barana, M. Marchisio, and M. Sacchet, “Interactive feedback for learning mathematics in a digital learning environment,†Education Sciences, vol. 11, no. 6, p. 279, 2021, https://doi.org/10.3390/educsci11060279.
[88] P. Daniels, “Auto-scoring of student speech: Proprietary vs. open-source solutions,†TESL-EJ: The Electronic Journal for English as a Second Language, vol. 26, no. 3, pp. 1-20, 2022, https://doi.org/10.55593/ej.26103int.
[89] H. Dasher and J. Pilgrim, “Paper vs. online assessments: A study of test-taking strategies for staar reading tests,†Texas Journal of Literacy Education, vol. 9, no. 3, pp. 7-20, 2022, https://orcid.org/0000-0003-3407-8562.
[90] J. M. Faro et al., “Video-based communication assessment for weight management counseling training in medical residents: a mixed methods study,†BMC Medical Education, vol. 22, no. 1, pp. 1–11, 2022, https://doi.org/10.1186/s12909-022-03984-6.
[91] I. Isnani, W. B. Utami, P. Susongko, and H. T. Lestiani, “Estimation of college students’ ability on real analysis course using Rasch model,†REID (Research and Evaluation in Education), vol. 5, no. 2, pp. 95-102, 2019, https://doi.org/10.21831/reid.v5i2.20924.
[92] W. Kaiss, K. Mansouri, and F. Poirier, “Pre-evaluation with a personalized feedback conversational agent integrated in Moodle,†International Journal of Emerging Technologies in Learning (iJET), vol. 18, no. 6, pp. 177-189, 2023, https://doi.org/10.3991/ijet.v18i06.36783.
[93] M. Lubrick and B. Wellington, “Formative learning assessment with online quizzing: Comparing target performance grade and best performance grade approaches,†Journal of Learning and Teaching in Digital Age, vol. 7, no. 2, pp. 297-306, 2022, https://doi.org/10.53850/joltida.1036295.
[94] S. H. Mercer and J. E. Cannon, “Validity of automated learning progress assessment in english written expression for students with learning difficulties,†Journal for Educational Research Online, vol. 14, no. 1, pp. 39-60, 2022, https://doi.org/10.31244/jero.2022.01.03.
[95] L. G. Otaya, B. Kartowagiran, H. Retnawati, and S. S. Mustakim, “Estimating the ability of pre-service and in-service Teacher Profession Education (TPE) participants using item response theory,†REID (Research and Evaluation in Education), vol. 6, no. 2, pp. 160-173, 2020, https://doi.org/10.21831/reid.v6i2.36043.
[96] R. Phoophuangpairoj and P. Pipattarasakul, “Preliminary indicators of EFL essay writing for teachers’ feedback using automatic text analysis,†International Journal of Educational Methodology, vol. 8, no. 1, pp. 55-68, 2022, https://doi.org/10.12973/ijem.8.1.55.
[97] M. D. Sahin and S. Gelbal, “Development of a multidimensional computerized adaptive test based on the bifactor model,†International Journal of Assessment Tools in Education, vol. 7, no. 3, pp. 323-342, 2020, https://doi.org/10.21449/ijate.707199.
[98] P. Zhao, C.-Y. Chang, Y. Shao, Z. Liu, H. Zhou, and J. Liu, “Utilization of process data in China: Exploring students’ problem-solving strategies in computer-based science assessment featuring interactive tasks,†Journal of Baltic Science Education, vol. 22, no. 5, pp. 929-944, 2023, https://doi.org/10.33225/jbse/23.22.929.
[99] S. Abdullah, W. Warsiyah, and J. Ju’subaidi, “Developing a religiosity scale for Indonesian Muslim youth,†REID (Research and Evaluation in Education), vol. 9, no. 1, pp. 73-85, 2023, https://doi.org/10.21831/reid.v9i1.61201.
[100] B. A. Dunya, C. McKown, and E. Smith, “Psychometric properties and differential item functioning of a web-based assessment of children’s emotion recognition skill,†Journal of Psychoeducational Assessment, vol. 38, no. 5, pp. 627-641, 2020, https://doi.org/10.1177/0734282919881919.
[101] I. K. Amalina and T. Vidákovich, “An integrated STEM-based mathematical problem-solving test: Developing and reporting psychometric evidence,†Journal on Mathematics Education, vol. 13, no. 4, pp. 587-604, 2022, https://doi.org/10.22342/jme.v13i4.pp587-604.
[102] M. Berlian, I. M. Mujtahid, R. Vebrianto, and M. Thahir, “Multiple intelligences instrument development: Identification system of multiple intelligences tutor,†REID (Research and Evaluation in Education), vol. 6, no. 2, pp. 119-129, 2020, https://doi.org/10.21831/reid.v6i2.35120.
[103] R. W. Daryono, V. L. Hariyanto, H. Usman, and S. Sutarto, “Factor analysis: Competency framework for measuring student achievements of architectural engineering education in Indonesia,†REID (Research and Evaluation in Education), vol. 6, no. 2, pp. 98-108, 2020, https://doi.org/10.21831/reid.v6i2.32743.
[104] H. H. Dewi, S. M. Damio, and S. Sukarno, “Item analysis of reading comprehension questions for English proficiency test using Rasch model,†REID (Research and Evaluation in Education), vol. 9, no. 1, pp. 24-36, 2023, https://doi.org/10.21831/reid.v9i1.53514.
[105] R. R. Fardhila and E. Istiyono, “An assessment instrument of mind map product to assess students’ creative thinking skill,†REID (Research and Evaluation in Education), vol. 5, no. 1, pp. 41-53, 2019, https://doi.org/10.21831/reid.v5i1.22525.
[106] S. Farida and F. A. Setiawati, “Developing assessment instruments of debate practice in Indonesian Language learning,†REID (Research and Evaluation in Education), vol. 7, no. 2, pp. 145-155, 2021, https://doi.org/10.21831/reid.v7i2.43338.
[107] K. N. Fathiyah, A. Alsa, and D. Setiyawati, “Psychometric characteristic of positive affect scale within the academic setting,†REID (Research and Evaluation in Education), vol. 5, no. 2, pp. 120-129, 2019, https://doi.org/10.21831/reid.v5i2.25992.
[108] I. W. Gunartha, T. Sulaiman, S. P. Suardiman, and B. Kartowagiran, “Developing instruments for measuring the level of early childhood development,†REID (Research and Evaluation in Education), vol. 6, no. 1, pp. 1-9, 2020, https://doi.org/10.21831/reid.v6i1.21996.
[109] R. M. Lia, A. Rusilowati, and W. Isnaeni, “NGSS-oriented chemistry test instruments: Validity and reliability analysis with the Rasch model,†REID (Research and Evaluation in Education), vol. 6, no. 1, pp. 41-50, 2020, https://doi.org/10.21831/reid.v6i1.30112.
[110] M. Muchlisin, D. Mardapi, and F. A. Setiawati, “An analysis of Javanese language test characteristic using the Rasch model in R program,†REID (Research and Evaluation in Education), vol. 5, no. 1, pp. 61-74, 2019, https://doi.org/10.21831/reid.v5i1.23773.
[111] A. Nurrahman, S. Sukirno, D. S. Pratiwi, J. Iskandar, A. Rahim, and I. S. Rahmaini, “Developing student social attitude self-assessment instruments: A study in vocational high school,†REID (Research and Evaluation in Education), vol. 8, no. 1, pp. 1-12, 2022, https://doi.org/10.21831/reid.v8i1.45100.
[112] A. Setiawan, W. Cendana, M. Ayres, A. A. Yuldashev, and S. P. Setyawati, “Development and validation of a self-assessment-based instrument to measure elementary school students’ attitudes in online learning,†REID (Research and Evaluation in Education), vol. 9, no. 2, pp. 184-197, 2023, https://doi.org/10.21831/reid.v9i2.52083.
[113] J. Subando, M. K. B. Wibowo, and F. Farkhani, “The development of measurement instruments of Sharia students’ perceptions about Khilafah,†REID (Research and Evaluation in Education), vol. 9, no. 2, pp. 141-155, 2023, https://doi.org/10.21831/reid.v9i2.63966.
[114] S. Sumin, F. Sukmawati, and N. Nurdin, “Gender differential item functioning on the Kentucky Inventory of Mindfulness Skills instrument using logistic regression,†REID (Research and Evaluation in Education), vol. 8, no. 1, pp. 55-66, 2022, https://doi.org/10.21831/reid.v8i1.50809.
[115] Y. P. Susani, G. R. Rahayu, Y. S. Prabandari, R. Sanusi, and H. Mardiwiyoto, “Developing an instrument to measure student’s perception of the medical education curriculum from the perspective of Communities of Practice theory,†REID (Research and Evaluation in Education), vol. 6, no. 2, pp. 109-118, 2020, https://doi.org/10.21831/reid.v6i2.31500.
[116] F. Falcão, D. M. Pereira, N. Gonçalves, A. De Champlain, P. Costa, and J. M. Pêgo, “A suggestive approach for assessing item quality, usability and validity of automatic item generation,†Advances in Health Sciences Education, vol. 28, no. 5, pp. 1441-1465, 2023, https://doi.org/10.1007/s10459-023-10225-y.
[117] M. Boussakuk, A. Bouchboua, M. El Ghazi, M. El Bekkali, and M. Fattah, “Design of computerized adaptive testing module into our dynamic adaptive hypermedia system,†International Journal of Emerging Technologies in Learning (iJET), vol. 16, no. 18, pp. 113-128, 2021, https://doi.org/10.3991/ijet.v16i18.23841.
[118] J. Renes, C. P. M. Van Der Vleuten, and C. F. Collares, “Utility of a multimodal computer-based assessment format for assessment with a higher degree of reliability and validity,†Medical Teacher, vol. 45, no. 4, pp. 433-441, 2023, https://doi.org/10.1080/0142159X.2022.2137011.
[119] D. G. H. Divayana, I. G. Sudirtha, and I. K. Suartama, “Digital test instruments based on wondershare-superitem for supporting distance learning implementation of assessment course,†International Journal of Instruction, vol. 14, no. 4, pp. 945-964, 2021, https://doi.org/10.29333/iji.2021.14454a.
[120] M. Zagaar and W. Chen, “Assessment for deeper understanding using concept maps: Lessons learned from flipped teaching of pharmacology,†Medical Science Educator, vol. 32, no. 6, pp. 1289-1297, 2022, https://doi.org/10.1007/s40670-022-01653-3.
[121] M. Reina et al., “PLATA: Design of an online platform for chemistry undergraduate fully automated assignments,†Journal of Chemical Education, vol. 101, no. 3, pp. 1024–1035, 2024, https://doi.org/10.1021/acs.jchemed.3c00962.
[122] Y. Karay, B. Reiss, and S. K. Schauber, “Progress testing anytime and anywhere – Does a mobile-learning approach enhance the utility of a large-scale formative assessment tool?,†Medical Teacher, vol. 42, no. 10, pp. 1154-1162, 2020, https://doi.org/10.1080/0142159X.2020.1798910.
[123] J. Fuentes-Cimma et al., “Utility analysis of an adapted Mini-CEX WebApp for clinical practice assessment in physiotherapy undergraduate students,†Frontiers in Education, vol. 8, pp. 1-10, 2023, https://doi.org/10.3389/feduc.2023.943709.
[124] C. Eitemüller, F. Trauten, M. Striewe, and M. Walpuski, “Digitalization of multistep chemistry exercises with automated formative feedback,†Journal of Science Education and Technology, vol. 32, no. 3, pp. 453-467, 2023, https://doi.org/10.1007/s10956-023-10043-2.
[125] C.-Y. Chou and N.-B. Zou, “An analysis of internal and external feedback in self-regulated learning activities mediated by self-regulated learning tools and open learner models,†International Journal of Educational Technology in Higher Education, vol. 17, no. 1, pp. 1-27, 2020, https://doi.org/10.1186/s41239-020-00233-y.
[126] B. Seipel, P. C. Kennedy, S. E. Carlson, V. Clinton-Lisell, and M. L. Davison, “MOCCA-College: Preliminary Validity Evidence of a Cognitive Diagnostic Reading Comprehension Assessment,†Journal of Learning Disabilities, vol. 56, no. 1, pp. 58-71, 2023, https://doi.org/10.1177/00222194221121340.
[127] W. Yuan et al., “Improving the resident assessment process: application of App-based e-training platform and lean thinking,†BMC Medical Education, vol. 23, no. 1, pp. 1-9, 2023, https://doi.org/10.1186/s12909-023-04118-2.
[128] E. Rowe et al., “Interactive Assessments of CT (IACT): Digital Interactive Logic Puzzles to Assess Computational Thinking in Grades 3-8,†International Journal of Computer Science Education in Schools, vol. 5, no. 2, pp. 28-73, 2021, https://doi.org/10.21585/ijcses.v5i1.149.
[129] D. Guzmanâ€Orth, Y. Song, and J. R. Sparks, “Designing accessible formative assessment tasks to measure argumentation skills for english learners,†ETS Research Report Series, vol. 2019, no. 1, pp. 1–15, 2019, https://doi.org/10.1002/ets2.12251.
[130] M. H. Dlab, S. Candrlic, and M. Pavlic, “Formative assessment activities to advance education: A case study,†Journal of Information Technology Education: Innovations in Practice, vol. 20, pp. 37–57, 2021, https://doi.org/10.28945/4758.
[131] S. Seifert and L. Paleczek, “Digitally assessing text comprehension in grades 3-4: Test development and validation,†Electronic Journal of e-Learning, vol. 19, no. 5, pp. 336-348, 2021, https://doi.org/10.34190/ejel.19.5.2467.
[132] C. Isler and B. Aydin, “Developing and validating a computerized oral proficiency test of english as a foreign language (COPTEFL),†International Journal of Assessment Tools in Education, vol. 8, no. 1, pp. 38-66, 2021, https://doi.org/10.21449/ijate.854678.
[133] A. Suryadi and S. Kusairi, “Developing computer-assisted formative feedback in the light of resource theory: A case on heat concept,†Journal of Technology and Science Education, vol. 11, no. 2, pp. 343–356, 2021, https://doi.org/10.3926/jotse.1100.
[134] F. Molin, C. Haelermans, S. Cabus, and W. Groot, “Do feedback strategies improve students’ learning gain?-Results of a randomized experiment using polling technology in physics classrooms,†Computers & Education, vol. 175, p. 104339, 2021, https://doi.org/10.1016/j.compedu.2021.104339.
[135] M. B. Ada, “Evaluation of a mobile web application for assessment feedback,†Technology, Knowledge and Learning, vol. 28, pp. 23-46, 2023, https://doi.org/10.1007/s10758-021-09575-6.
[136] S. H. P. W. Gamage, J. R. Ayres, M. B. Behrend, and E. J. Smith, “Optimising Moodle quizzes for online assessments,†International Journal of STEM Education, vol. 6, no. 1, p. 27, 2019, https://doi.org/10.1186/s40594-019-0181-4.
[137] I. López-Tocón, “Moodle quizzes as a continuous assessment in higher education: An exploratory approach in physical chemistry,†Education Sciences, vol. 11, pp. 1–12, 2021, https://doi.org/10.3390/educsci11090500.
[138] S. W. Widyaningsih, I. Yusuf, Z. K. Prasetyo, and E. Istiyono, “The development of the HOTS test of physics based on modern test theory: Question modeling through e-learning of Moodle LMS,†International Journal of Instruction, vol. 14, no. 4, pp. 51–68, 2021, https://doi.org/10.29333/iji.2021.1444a.
[139] A. Barana and M. M. Conte, “Promoting socioeconomic equity through automatic formative assessment,†Journal on Mathematics Education, vol. 15, no. 1, pp. 227–252, 2024, https://doi.org/10.22342/jme.v15i1.pp227-252.
[140] B. Duffy, R. Tully, and A. V. Stanton, “An online case-based teaching and assessment program on clinical history-taking skills and reasoning using simulated patients in response to the COVID-19 pandemic,†BMC Medical Education, vol. 23, no. 1, p. 4, 2023, https://doi.org/10.1186/s12909-022-03950-2.
[141] K. Khalaf, M. El-Kishawi, M. A. Moufti, and S. Al Kawas, “Introducing a comprehensive high-stake online exam to final-year dental students during the COVID-19 pandemic and evaluation of its effectiveness,†Medical Education Online, vol. 25, no. 1, pp. 1–10, 2020, https://doi.org/10.1080/10872981.2020.1826861.
[142] N. Mdlalose, S. Ramaila, and U. Ramnarain, “Using Kahoot! as a formative assessment tool in science teacher education,†International Journal of Higher Education, vol. 11, no. 2, pp. 43–51, 2022, https://doi.org/10.5430/ijhe.v11n2p43.
[143] K. N. A. Al-Mwzaiji and A. A. F. Alzubi, “Online self-evaluation: the EFL writing skills in focus,†Asian-Pacific Journal of Second and Foreign Language Education, vol. 7, no. 1, pp. 1–16, 2022, https://doi.org/10.1186/s40862-022-00135-8.
[144] W. J. A. J. Hendriks et al., “Certainty-based marking in a formative assessment improves student course appreciation but not summative examination scores,†BMC Medical Education, vol. 19, no. 1, pp. 1–11, 2019, https://doi.org/10.1186/s12909-019-1610-2.
[145] M. Muftah, F. A. Y. Al-Inbari, B. Q. Al-Wasy, and H. S. Mahdi, “The role of automated corrective feedback in improving EFL learners’ mastery of the writing aspects,†Psycholinguistics, vol. 34, no. 2, pp. 82-109, 2023, https://doi.org/10.31470/2309-1797-2023-34-2-82-109.
[146] Y. Qian and J. D. Lehman, “Using targeted feedback to address common student misconceptions in introductory programming: A data-driven approach,†SAGE Open, vol. 9, no. 4, pp. 1–20, 2019, https://doi.org/10.1177/2158244019885136.
[147] M. F. Areed, M. A. Amasha, R. A. Abougalala, S. Alkhalaf, and D. Khairy, “Developing gamification e-quizzes based on an android app: The impact of asynchronous form,†Education and Information Technologies, vol. 26, no. 4, pp. 4857–4878, 2021, https://doi.org/10.1007/s10639-021-10469-4.
[148] U. Schepke, M. E. Van Wulfften Palthe, E. W. Meisberger, W. Kerdijk, M. S. Cune, and B. Blok, “Digital assessment of a retentive full crown preparation—An evaluation of prepCheck in an undergraduate preâ€clinical teaching environment,†European Journal of Dental Education, vol. 24, no. 3, pp. 407–424, 2020, https://doi.org/10.1111/eje.12516.
[149] A. Mizumoto, Y. Sasao, and S. A. Webb, “Developing and evaluating a computerized adaptive testing version of the Word Part Levels Test,†Language Testing, vol. 36, no. 1, pp. 101–123, 2019, https://doi.org/10.1177/0265532217725776.
[150] W.-H. Chuo et al., “Evaluate the feasibility of the implementation of e-assessment in objective structured clinical examination (OSCE) in pharmacy education from the examiner’s perspectives,†Education Sciences, vol. 11, pp. 1–14, 2021, https://doi.org/10.3390/educsci11050194.
[151] H. Çeliktas and R. E. Demirbatir, “Effect of online quizzes on music theory achievement of freshman music teaching students,†Journal of Education and Learning (EduLearn), vol. 16, no. 1, pp. 130–136, 2022, https://doi.org/10.11591/edulearn.v16i1.20379.
[152] Z. Ç. Köroglu, “Using digital formative assessment to evaluate EFL learners’ english speaking skills,†GIST Education and Learning Research Journal, vol. 22, pp. 103–123, 2021, https://doi.org/10.26817/16925777.1001.
[153] A. A. Lopez, D. Guzmanâ€Orth, D. Zapataâ€Rivera, C. M. Forsyth, and C. Luce, “Examining the accuracy of a conversationâ€based assessment in interpreting english learners’ written responses,†ETS Research Report Series, vol. 2021, no. 1, pp. 1–15, 2021, https://doi.org/10.1002/ets2.12315.
[154] T. de Lange, A. Møystad, and G. Torgersen, “How can videoâ€based assignments integrate practical and conceptual knowledge in summative assessment? Student experiences from a longitudinal experiment,†British Educational Research Journal, vol. 46, no. 6, pp. 1279–1299, 2020, https://doi.org/10.1002/berj.3632.
[155] A. P. Goodwin et al., “Monster, P.I.: Validation evidence for an assessment of adolescent language that assesses vocabulary knowledge, morphological knowledge, and syntactical awareness,†Assessment for Effective Intervention, vol. 47, no. 2, pp. 89–100, 2022, https://doi.org/10.1177/1534508420966383.
[156] E. Istiyono, W. S. B. Dwandaru, R. Setiawan, and I. Megawati, “Developing of computerized adaptive testing to measure physics higher order thinking skills of senior high school students and its feasibility of use,†European Journal of Educational Research, vol. 9, no. 1, pp. 91–101, 2020, https://doi.org/10.12973/eu-jer.9.1.91.
[157] S. Hopkins and R. O’Donovan, “Developing assessments for students with intellectual disability to support differentiation,†Mathematics Teacher Education and Development, vol. 23, no. 3, pp. 132–147, 2021, http://orcid.org/0000-0002-5826-0193.
[158] C. Conner, A. R. Henry, E. J. Solari, and M. C. Zajic, “Conducting oral and written language adapted tele-assessments with early elementary-age children with autism spectrum disorder,†Autism & Developmental Language Impairments, vol. 7, pp. 1–15, 2022, https://doi.org/10.1177/23969415221133268.
[159] C.-A. Lee, N.-F. Huang, J.-W. Tzeng, and P.-H. Tsai, “AI-based diagnostic assessment system: Integrated with knowledge map in MOOCs,†IEEE Transactions on Learning Technologies, vol. 16, no. 5, pp. 873–886, 2023, https://doi.org/10.1109/TLT.2023.3308338.
[160] S. Radović, N. Seidel, J. M. Haake, and R. Kasakowskij, “Analysing students’ selfâ€assessment practice in a distance education environment: Student behaviour, accuracy, and taskâ€related characteristics,†Journal of Computer Assisted Learning, vol. 40, no. 2, pp. 654–666, 2024, https://doi.org/10.1111/jcal.12907.
[161] M. K. Wolf and A. A. Lopez, “Developing a technology-based classroom assessment of academic reading skills for english language learners and teachers: Validity evidence for formative use,†Languages, vol. 7, no. 2, pp. 1–22, 2022, https://doi.org/10.3390/languages7020071.
[162] J. R. Rico-Juan, V. M. Sánchez-Cartagena, J. J. Valero-Mas, and A. J. Gallego, “Identifying student profiles within online judge systems using explainable artificial intelligence,†IEEE Transactions on Learning Technologies, vol. 16, no. 6, pp. 955–969, 2023, https://doi.org/10.1109/TLT.2023.3239110.
[163] D. Di Mitri, J. Schneider, and H. Drachsler, “Keep me in the loop: Real-time feedback with multimodal data,†International Journal of Artificial Intelligence in Education, vol. 32, no. 4, pp. 1093–1118, 2022, https://doi.org/10.1007/s40593-021-00281-z.
[164] J. P. Bernius, S. Krusche, and B. Bruegge, “Machine learning based feedback on textual student answers in large courses,†Computers and Education: Artificial Intelligence, vol. 3, pp. 1–16, 2022, https://doi.org/10.1016/j.caeai.2022.100081.
[165] M. Almasre, “Development and evaluation of a custom GPT for the assessment of students’ designs in a typography course,†Education Sciences, vol. 14, no. 2, pp. 1–19, 2024, https://doi.org/10.3390/educsci14020148.
[166] A. Darvishi, H. Khosravi, S. Sadiq, and D. Gašević, “Incorporating AI and learning analytics to build trustworthy peer assessment systems,†British Journal of Educational Technology, vol. 53, no. 4, pp. 844–875, 2022, https://doi.org/10.1111/bjet.13233.
[167] L. Kaldaras, N. R. Yoshida, and K. C. Haudek, “Rubric development for AI-enabled scoring of three-dimensional constructed-response assessment aligned to NGSS learning progression,†Frontiers in Education, vol. 7, p. 983055, 2022, https://doi.org/10.3389/feduc.2022.983055.
[168] M. T. Nagy and E. Korom, “Measuring scientific reasoning of fourth graders: Validation of the Science-K inventory in paper-based and computer-based testing environments,†Journal of Baltic Science Education, vol. 22, no. 6, pp. 1050–1062, 2023, https://doi.org/10.33225/jbse/23.22.1050.
[169] T. Simon, I. Biró, and A. Kárpáti, “Developmental assessment of visual communication skills in primary education,†Journal of Intelligence, vol. 10, pp. 1–20, 2022, https://doi.org/10.3390/jintelligence10030045.
[170] S. Varga, A. Pásztor, and J. Stekács, “Online assessment of morphological awareness in grades 2-4: Its development and relation to reading comprehension,†Journal of Intelligence, vol. 10, no. 47, pp. 1–19, 2022, https://doi.org/10.3390/jintelligence10030047.
[171] K.-C. Li, M. Chang, and K.-H. Wu, “Developing a task-based dialogue system for english language learning,†Education Sciences, vol. 10, no. 11, pp. 1–20, 2020, https://doi.org/10.3390/educsci10110306.
[172] A. L. C. Barczak, A. Mathrani, B. Han, and N. H. Reyes, “Automated assessment system for programming courses: A case study for teaching data structures and algorithms,†Educational technology research and development, vol. 71, no. 6, pp. 2365–2388, 2023, https://doi.org/10.1007/s11423-023-10277-2.
[173] L. Zheng, M. Long, B. Chen, and Y. Fan, “Promoting knowledge elaboration, socially shared regulation, and group performance in collaborative learning: an automated assessment and feedback approach based on knowledge graphs,†International Journal of Educational Technology in Higher Education, vol. 20, no. 1, pp. 1–20, 2023, https://doi.org/10.1186/s41239-023-00415-4.
[174] S. Kusairi, D. Anggita Puspita, A. Suryadi, and H. Suwono, “Physics formative feedback game: Utilization of isomorphic multiple-choice items to help students learn kinematics,†TEM Journal, vol. 9, no. 4, pp. 1625–1632, 2020, https://doi.org/10.18421/TEM94-39.
[175] X. Lv, L. Li, L. Guo, T. He, and S. Liu, “Game-based formative assessment of analogical reasoning in preschool children: Support from the internet of things technology,†Sustainability, vol. 14, no. 21, pp. 1–17, 2022, https://doi.org/10.3390/su142113830.
[176] E. Rushton and S. Corrigan, “Game-assisted assessment for broader adoption: Participatory design and game-based scaffolding,†Electronic Journal of e-Learning, vol. 19, no. 2, pp. 71–87, 2021, https://doi.org/10.34190/ejel.19.2.2143.
[177] A. J. King, J. M. Kahn, E. B. Brant, G. F. Cooper, and D. L. Mowery, “Initial development of an automated platform for assessing trainee performance on case presentations,†ATS Scholar, vol. 3, no. 4, pp. 548–560, 2022, https://doi.org/10.34197/ats-scholar.2022-0010OC.
[178] E. Rusman and R. Nadolski, “Pe(e)rfectly skilled: Underpinnings of an online formative assessment method for (inter)active and practice-based complex skills training in higher education (HE),†International Journal of Mobile and Blended Learning, vol. 15, no. 2, pp. 1–14, 2023, https://doi.org/10.4018/ijmbl.318646.
[179] R. J. Nadolski, H. G. K. Hummel, E. Rusman, and K. Ackermans, “Rubric formats for the formative assessment of oral presentation skills acquisition in secondary education,†Educational Technology Research and Development, vol. 69, no. 5, pp. 2663–2682, 2021, https://doi.org/10.1007/s11423-021-10030-7.
[180] S. van Ginkel et al., “Fostering oral presentation competence through a virtual reality-based task for delivering feedback,†Computers & Education, vol. 134, pp. 78–97, 2019, https://doi.org/10.1016/j.compedu.2019.02.006.
[181] W. A. P. Udeshinee, O. Knutsson, and S. Männikkö-Barbutiu, “Text chat-mediated dynamic assessment towards self-regulation in language learning,†International Journal of Mobile and Blended Learning, vol. 16, no. 1, pp. 1–21, 2023, https://doi.org/10.4018/ijmbl.335067.
[182] A. Sayin, S. Bozdag, and M. J. Gierl, “Automatic item generation for non-verbal reasoning items,†International Journal of Assessment Tools in Education, vol. 10, pp. 131–147, 2023, https://doi.org/10.21449/ijate.1359348.
[183] A. Sayin and M. J. Gierl, “Automatic item generation for online measurement and evaluation: Turkish literature items,†International Journal of Assessment Tools in Education, vol. 10, no. 2, pp. 218–231, 2023, https://doi.org/10.21449/ijate.1249297.
[184] I. Uysal and N. Dogan, “Automated essay scoring effect on test equating errors in mixed-format test,†International Journal of Assessment Tools in Education, vol. 8, no. 2, pp. 222–238, 2021, https://doi.org/10.21449/ijate.815961.
[185] M. Park, “Effects of simulation-based formative assessments on students’ conceptions in physics,†Eurasia Journal of Mathematics, Science and Technology Education, vol. 15, no. 7, pp. 1–18, 2019, https://doi.org/10.29333/ejmste/103586.
[186] A. Mizumoto and M. Eguchi, “Exploring the potential of using an AI language model for automated essay scoring,†Research Methods in Applied Linguistics, vol. 2, no. 2, p. 100050, 2023, https://doi.org/10.1016/j.rmal.2023.100050.
[187] S. H. Mercer, J. E. Cannon, B. Squires, Y. Guo, and E. Pinco, “Accuracy of automated written expression curriculum-based measurement scoring,†Canadian Journal of School Psychology, vol. 36, no. 4, pp. 304–317, 2021, https://doi.org/10.1177/0829573520987753.
[188] O. T. Aricak, A. Avcu, F. Topçu, and M. G. Tutlu, “Use of item response theory to validate cyberbullying sensibility scale for university students,†International Journal of Assessment Tools in Education, vol. 7, no. 1, pp. 18–29, 2020, https://doi.org/10.21449/ijate.629584.
[189] N. Cikrikci et al., “Development of a computerized adaptive version of the Turkish driving licence exam,†International Journal of Assessment Tools in Education, vol. 7, no. 4, pp. 570–587, 2020, https://doi.org/10.21449/ijate.716177.
[190] M. Koch, F. M. Spinath, S. Greiff, and N. Becker, “Development and validation of the open matrices item bank,†Journal of Intelligence, vol. 10, pp. 1–10, 2022, https://doi.org/10.3390/jintelligence10030041.
[191] A. Mohammed, A. K. S. Dawood, T. Alghazali, Q. K. Kadhim, A. A. Sabti, and S. H. Sabit, “A cognitive diagnostic assessment study of the reading comprehension section of the preliminary english test (PET),†International Journal of Language Testing, vol. 13, pp. 1-20, 2023, https://doi.org/10.22034/ijlt.2022.362849.1195.
[192] J. Seifried, S. Brandt, K. Kögler, and A. Rausch, “The computer-based assessment of domain-specific problem-solving competence—A three-step scoring procedure,†Cogent Education, vol. 7, no. 1, p. 1719571, 2020, https://doi.org/10.1080/2331186X.2020.1719571.
[193] M.-J. Agost, P. Company, M. Contero, and J. D. Camba, “CAD training for digital product quality: A formative approach with computer-based adaptable resources for self-assessment,†International Journal of Technology and Design Education, vol. 32, no. 2, pp. 1393–1411, 2022, https://doi.org/10.1007/s10798-020-09651-5.
[194] U. C. K. Al Muniandy, H. Zulnaidi, and S. H. Halili, “Validity and reliability of the situational motivational scale (SIMS) instrument: Using Rasch model and AMOS,†Malaysian Online Journal of Educational Sciences, vol. 11, no. 1, pp. 34-46, 2023, https://ajba.um.edu.my/index.php/MOJES/article/view/41265/15463.
[195] M. Saefi et al., “Validating of knowledge, attitudes, and practices questionnaire for prevention of COVID-19 infections among undergraduate students: A Rasch and factor analysis,†Eurasia Journal of Mathematics, Science and Technology Education, vol. 16, no. 12, pp. 1-14, 2020, https://doi.org/10.29333/ejmste/9352.
[196] M. A. Samsudin, T. S. Chut, M. E. Ismail, and N. J. Ahmad, “A calibrated item bank for computerized adaptive testing in measuring science TIMSS performance,†Eurasia Journal of Mathematics, Science and Technology Education, vol. 16, no. 7, pp. 1–15, 2020, https://doi.org/10.29333/ejmste/8259.
[197] S. Soeharto, “Development of a diagnostic assessment test to evaluate science misconceptions in terms of school grades: A Rasch measurement aproach,†Journal of Turkish Science Education, vol. 18, no. 3, pp. 351-370, 2021, https://doi.org/10.36681/tused.2021.78.
[198] D. Y. Ergül and M. F. Tasar, “Development and validation of the teachers’ digital competence scale (TDiCoS),†Journal of Learning and Teaching in Digital Age, vol. 8, no. 1, pp. 148–160, 2023, https://doi.org/10.53850/joltida.1204358.
[199] A. S. Konca, O. Baltaci, and O. F. Akbulut, “Problematic technology use scale for young children (PTUS-YC): Validity and reliability study,†International Journal of Assessment Tools in Education, vol. 9, no. 2, pp. 267-289, 2022, https://doi.org/10.21449/ijate.888936.
[200] J. D. G. Quinto, “Development and validation of survey instrument on game-based learning approach (SIGBLA),†International Journal of Emerging Technologies in Learning (iJET), vol. 17, no. 15, pp. 233–242, 2022, https://doi.org/10.3991/ijet.v17i15.33267.
[201] S. Sovey, K. Osman, and M. E. E. Mohd-Matore, “Exploratory and confirmatory factor analysis for disposition levels of computational thinking instrument among secondary school students,†European Journal of Educational Research, vol. 11, no. 2, pp. 639-652, 2022, https://doi.org/10.12973/eu-jer.11.2.639.
[202] M. P. Virtic, A. D. Plessis, and A. Šorgo, “Development and validation of the ‘mentoring for effective teaching practicum instrument’,” Center for Educational Policy Studies Journal, vol. 13, no. 3, pp. 233–260, 2023, https://doi.org/10.26529/cepsj.1315.
[203] A. Rachmatullah et al., “Development and validation of the middle grades computer science concept inventory (MG-CSCI) assessment,” Eurasia Journal of Mathematics, Science and Technology Education, vol. 16, no. 5, pp. 1–24, 2020, https://doi.org/10.29333/ejmste/116600.
[204] M. Polat, “Comparison of performance measures obtained from foreign language tests according to item response theory vs classical test theory,” International Online Journal of Education and Teaching, vol. 9, no. 1, pp. 471–485, 2022, https://iojet.org/index.php/IOJET/article/view/1583.
[205] C. Eckerly, Y. Jia, and P. Jewsbury, “Technology-enhanced items and model–data misfit (Research Report No. RR-22-11),” ETS Research Report Series, vol. 22, no. 1, pp. 1–16, 2022, https://doi.org/10.1002/ets2.12353.
[206] L. Viskotová and D. Hampel, “Increasing the efficiency of teacher’s work: The case of undergraduate mathematics mid-term assessment,” Mathematics Teaching Research Journal, vol. 14, no. 2, pp. 88–104, 2022, https://files.eric.ed.gov/fulltext/EJ1350208.pdf.
[207] P. Gawliczek, V. Krykun, N. Tarasenko, M. Tyshchenko, and O. Shapran, “Computer adaptive language testing according to NATO STANAG 6001 requirements,” Advanced Education, vol. 17, pp. 19–26, 2021, https://doi.org/10.20535/2410-8286.225018.
[208] L. Kuklick and M. A. Lindner, “Affective-motivational effects of performance feedback in computer-based assessment: Does error message complexity matter?,” Contemporary Educational Psychology, vol. 73, pp. 1–14, 2023, https://doi.org/10.1016/j.cedpsych.2022.102146.
[209] K. A. Kroeze, S. M. Van Den Berg, B. P. Veldkamp, and T. De Jong, “Automated assessment of and feedback on concept maps during inquiry learning,” IEEE Transactions on Learning Technologies, vol. 14, no. 4, pp. 460–473, 2021, https://doi.org/10.1109/TLT.2021.3103331.
[210] T.-C. Hsu, Y.-S. Chang, M.-S. Chen, I.-F. Tsai, and C.-Y. Yu, “A validity and reliability study of the formative model for the indicators of STEAM education creations,” Education and Information Technologies, vol. 28, no. 7, pp. 8855–8878, 2023, https://doi.org/10.1007/s10639-022-11412-x.
[211] R. P. Chalmers, “mirt: A multidimensional item response theory package for the R environment,” Journal of Statistical Software, vol. 48, no. 6, pp. 1–29, 2012, https://doi.org/10.18637/jss.v048.i06.
[212] B. Boitshwarelo, A. K. Reedy, and T. Billany, “Envisioning the use of online tests in assessing twenty-first century learning: A literature review,” Research and Practice in Technology Enhanced Learning, vol. 12, no. 1, p. 16, 2017, https://doi.org/10.1186/s41039-017-0055-7.
[213] A. Surya and A. Aman, “Developing formative authentic assessment instruments based on learning trajectory for elementary school,” REID (Research and Evaluation in Education), vol. 2, no. 1, pp. 13–24, 2016, https://doi.org/10.21831/reid.v2i1.6540.
[214] T. K. Gunning et al., “Who engaged in the team-based assessment? Leveraging EdTech for a self and intra-team peer-assessment solution to free-riding,” International Journal of Educational Technology in Higher Education, vol. 19, no. 1, pp. 1–22, 2022, https://doi.org/10.1186/s41239-022-00340-y.
[215] P. Karaman, “The effect of formative assessment practices on student learning: A meta-analysis study,” International Journal of Assessment Tools in Education, vol. 8, no. 4, pp. 801–817, 2021, https://doi.org/10.21449/ijate.870300.
[216] O. Ndayizeye, “Discrepancies in assessing undergraduates’ pragmatics learning,” REID (Research and Evaluation in Education), vol. 3, no. 2, pp. 133–143, 2017, https://doi.org/10.21831/reid.v3i2.14487.
[217] J. D. Elicker and N. L. McConnell, “Interactive learning in the classroom: Is student response method related to performance?,” Teaching of Psychology, vol. 38, no. 3, pp. 147–150, 2011, https://doi.org/10.1177/0098628311411789.
[218] J. Willis and V. Klenowski, “Classroom assessment practices and teacher learning: An Australian perspective,” in Teacher Learning with Classroom Assessment, pp. 19–37, 2018, https://doi.org/10.1007/978-981-10-9053-0_2.
[219] J. Moss, S. Godinho, and E. Chao, “Enacting the Australian Curriculum: Primary and secondary teachers’ approaches to integrating the curriculum,” Australian Journal of Teacher Education, vol. 44, no. 3, pp. 24–41, 2019, https://doi.org/10.14221/ajte.2018v44n3.2.
[220] F. M. van der Kleij, J. J. Cumming, and A. Looney, “Policy expectations and support for teacher formative assessment in Australian education reform,” Assessment in Education: Principles, Policy & Practice, vol. 25, no. 6, pp. 620–637, 2018, https://doi.org/10.1080/0969594X.2017.1374924.
[221] International Test Commission and Association of Test Publishers, “Guidelines for technology-based assessment,” 2022, https://www.intestcom.org/upload/media-library/tba-guidelines-final-2-23-2023-v4-167785144642TgY.pdf.
[222] U. Bezirhan and M. von Davier, “Automated reading passage generation with OpenAI’s large language model,” Computers and Education: Artificial Intelligence, vol. 5, pp. 1–13, 2023, https://doi.org/10.1016/j.caeai.2023.100161.
[223] O. Bulut et al., “The rise of artificial intelligence in educational measurement: Opportunities and ethical challenges,” arXiv preprint arXiv:2406.18900, 2024, https://doi.org/10.48550/arXiv.2406.18900.
[224] A. Khademi, “Can ChatGPT and Bard generate aligned assessment items? A reliability analysis against human performance,” Journal of Applied Learning & Teaching, vol. 6, no. 1, pp. 75–80, 2023, https://doi.org/10.37074/jalt.2023.6.1.28.
[225] J. Rudolph, S. Tan, and S. Tan, “ChatGPT: Bullshit spewer or the end of traditional assessments in higher education?,” Journal of Applied Learning & Teaching, vol. 6, no. 1, pp. 342–363, 2023, https://doi.org/10.37074/jalt.2023.6.1.9.
[226] H. Zhang, H. Song, S. Li, M. Zhou, and D. Song, “A survey of controllable text generation using transformer-based pre-trained language models,” ACM Computing Surveys, vol. 56, no. 3, pp. 1–37, 2024, https://doi.org/10.1145/3617680.
[227] I. O. Gallegos et al., “Bias and fairness in large language models: A survey,” arXiv preprint arXiv:2309.00770, 2023, https://doi.org/10.48550/arXiv.2309.00770.

