International Journal of Science and Engineering

July-December 2025, Vol. 4, Issue 2

The Read-Reflect-Respond (R³) Framework: Designing AI-resilient Assignments to Promote Authentic Learning and Academic Integrity

Abstract

The emergence of AI in education has created both opportunities and challenges, especially in students' examinations and assessments. Today, AI can solve almost any problem in seconds and present the answer in any style. This is useful in education, but it also leads to misuse in assignments and examinations, where students can answer questions through AI without thinking about them independently. AI provides simple solutions tailored to the prompt and can mimic human-like responses. Using AI to solve assignment questions therefore poses a challenge to the development of creative and critical thinking. Increasingly, students copy AI-generated text directly into their answer sheets. Although AI has the potential to solve almost any problem, evaluating students' work ethically has become a challenge for educators, even though several tools, such as plagiarism detectors and AI-content detectors, are available. This paper proposes AI-resilient assignments that maintain academic integrity through a combined product- and process-based evaluation approach. These assignments resemble the open-book system: they require human intelligence, independent thinking, and personal understanding to be solved correctly.

Authors

Atul Sahu1, Chandrakant Kumar Singh2*, A. K. Malik3 (Pages 97-111)
Email: cksingh@uprtou.ac.in
Affiliation: School of Sciences, UP Rajarshi Tandon Open University, Prayagraj, UP, India
DOI: https://doi.org/10.58517/IJSE.2025.04202

Keywords

AI-resilient assignments, Sustainable assessment framework, Educational assessment, AI-resilience, Ethical AI in education.

References

1.      Adiguzel, T., Kaya, M. H., & Cansu, F. K. (2023). Revolutionizing education with AI: Exploring the transformative potential of ChatGPT. Contemporary Educational Technology, 15(3), Article 13152. https://doi.org/10.30935/CEDTECH/13152

2.      Awadallah Alkouk, W., & Khlaif, Z. N. (2024). AI-resistant assessments in higher education: Practical insights from faculty training workshops. Frontiers in Education, 9, Article 1499495. https://doi.org/10.3389/feduc.2024.1499495

3.      Aziz, M. N. A., Yusoff, N. M., & Yaakob, M. F. M. (2020). Challenges in using authentic assessment in 21st century ESL classrooms. International Journal of Evaluation and Research in Education, 9(3), 759–768. https://doi.org/10.11591/ijere.v9i3.20546

4.      Bansal, D. (2022). Open book examinations: Modifying pedagogical practices for effective teaching and learning. The Law Teacher, 56(3), 354–367. https://doi.org/10.1080/03069400.2021.1999151

5.      Botuzova, Y., Iievliev, O., Okipniak, I., Yandola, K., & Charkina, T. (2023). Innovative approaches to assessment in pedagogical practice. Cadernos de Educação Tecnologia e Sociedade, 16(2), 386–398. https://doi.org/10.14571/brajets.v16.n2.386-398

6.      Dawson, P., Nicola-Richmond, K., & Partridge, H. (2024). Beyond open book versus closed book: A taxonomy of restrictions in online examinations. Assessment & Evaluation in Higher Education, 49(2), 262–274. https://doi.org/10.1080/02602938.2023.2209298

7.      Elstad, E., & Eriksen, H. (2025). Antecedents of Norwegian high school teachers’ AI resistance. Journal of Teacher Education and Educators, 14(2), 181–197.

8.      Gill-Simmen, L., & Tsaousi, C. (2025). Navigating resistance: Understanding student reluctance to experiment with AI in higher education. In EDEN proceedings (pp. 159–161). https://pure.royalholloway.ac.uk/en/publications/navigating-resistance-understanding-student-reluctance-to-experim/

9.      Kalmus, J.-E., & Nikiforova, A. (2024). To accept or not to accept? An IRT-TOE framework to understand educators’ resistance to generative AI in higher education. Zenodo. https://doi.org/10.5281/zenodo.13122996

10.   Prasad, K. M. V. V., & Aluvalu, R. (2017). Benefits and challenges of open book examination as assessment model for engineering courses. Journal of Engineering Education Transformations, 2394–1707. https://doi.org/10.16920/jeet/2017/v0i0/111682

11.   Rehman, J., Ali, R., Afzal, A., Shakil, S., Sultan, A. S., Idrees, R., & Fatima, S. S. (2022). Assessment during COVID-19: Quality assurance of an online open book formative examination for undergraduate medical students. BMC Medical Education, 22, Article 792. https://doi.org/10.1186/s12909-022-03849-y

12.   Shrivastava, P. (2025). Understanding acceptance and resistance toward generative AI technologies: A multi-theoretical framework integrating functional, risk, and sociolegal factors. Frontiers in Artificial Intelligence, 8, 1565927. https://doi.org/10.3389/frai.2025.1565927

13.   Teo, T. H., & Chew, D. (2025). A comprehensive review of strategies and policies toward AI-resistant. Journal of Sociology and Education, 1(6). https://doi.org/10.63887/jse.2025.1.6.15


AACS Journals

Copyright © 2020 AACS. All rights reserved.