Real time Gaze Tracking in Remote Proctoring: A Study of Appearance-based Gaze Estimation


dc.contributor.author Onim, Nafiul
dc.contributor.author Shahid, Mirza Sadaf
dc.contributor.author Quayes, Muhammad Rafsan
dc.date.accessioned 2024-09-05T10:12:18Z
dc.date.available 2024-09-05T10:12:18Z
dc.date.issued 2023-05-30
dc.identifier.uri http://hdl.handle.net/123456789/2164
dc.description Supervised by Dr. Hasan Mahmud, Associate Professor; Mr. Fardin Saad, Lecturer; and Dr. Md. Kamrul Hasan, Professor, Department of Computer Science and Engineering (CSE), Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh en_US
dc.description.abstract Our thesis aims to address the critical issue of academic dishonesty in online examinations by proposing a proctoring system that integrates eye gaze tracking technology for the detection of suspicious behavior. The study begins by discussing the existing challenges of current examination systems and identifying the problems that need to be addressed. It emphasizes the necessity for a more advanced proctoring system with gaze tracking capabilities to effectively deter attempts at academic dishonesty. The research is divided into two main parts: the selection of an appropriate model and the incorporation of proctoring functionalities. Two models were chosen for evaluation, namely iTracker, which was pre-trained on the GazeCapture dataset, and L2cs-net, which we trained on the MPIIFaceGaze dataset. The findings from these experiments indicate that L2cs-net outperforms iTracker in terms of accuracy, speed, and latency, but only when supplied with the processing power of a GPU; without one, iTracker performs better. Regarding the proctoring system aspect, most of the existing research is commercially driven, with limited academic contributions. To optimize the proctoring system for online exams, we recognize the significant value of examinees’ eye gaze and define important regions on and off the screen through calibration using “magic pixels”. Moreover, we attribute cheating criteria using a formulated equation that takes into account factors such as Count, Frequency, Duration, and Regression. Two potential approaches for the proctoring system, namely Thresholding and Machine Learning (ML), are considered; however, our focus lies on the development of a thresholding-based approach (an illustrative sketch follows this record). Overall, this thesis presents a comprehensive exploration of academic dishonesty in online examinations, proposes a proctoring system using eye gaze tracking technology, and compares the performance of different models and methodologies. The findings contribute to the advancement of proctoring systems and provide insights for the development of more effective measures against academic dishonesty. en_US
dc.language.iso en en_US
dc.publisher Department of Computer Science and Engineering (CSE), Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh en_US
dc.subject Smart Proctoring Systems, Academic Dishonesty, Malicious Intent Prediction, Neural Networks, CNN, Appearance-based Models, Eye Gaze Tracking en_US
dc.title Real time Gaze Tracking in Remote Proctoring: A Study of Appearance-based Gaze Estimation en_US
dc.type Thesis en_US
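
Illustrative sketch. The abstract above describes a thresholding-based decision built on a cheating-criteria equation over Count, Frequency, Duration, and Regression, but this record does not reproduce the equation or its thresholds. The Python sketch below only illustrates what such a weighted score and cutoff could look like; the field names, weights, and threshold value are assumptions for illustration, not the thesis's actual formulation.

# Illustrative sketch only: the thesis's actual equation, weights, and
# threshold are not reproduced in this record; every number and field
# name below is a hypothetical placeholder.
from dataclasses import dataclass

@dataclass
class GazeStats:
    count: int          # Count: off-screen gaze events during the exam
    frequency: float    # Frequency: off-screen events per minute
    duration: float     # Duration: total seconds spent gazing off-screen
    regressions: int    # Regression: repeated returns to the same off-screen region

def suspicion_score(s: GazeStats,
                    w_count: float = 0.25, w_freq: float = 0.25,
                    w_dur: float = 0.25, w_reg: float = 0.25) -> float:
    """Weighted combination of the four factors named in the abstract."""
    return (w_count * s.count + w_freq * s.frequency +
            w_dur * s.duration + w_reg * s.regressions)

def flag_examinee(s: GazeStats, threshold: float = 10.0) -> bool:
    """Thresholding approach: flag the examinee when the score exceeds a cutoff."""
    return suspicion_score(s) > threshold

# Example with placeholder numbers:
# flag_examinee(GazeStats(count=12, frequency=2.5, duration=40.0, regressions=3))
# -> score = 0.25 * (12 + 2.5 + 40.0 + 3) = 14.375 > 10.0, so True (flagged).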

