DOI: 10.17587/prin.14.137-145
Comparison of Approaches to Tracking Operators' Distribution of Attention Using Oculographic Interfaces
Ya. A. Turovsky, Leading Researcher, yaroslav_turovsk@mail.ru,
V. A. Trapeznikov Institute of Control Sciences of Russian Academy of Sciences,
Moscow, 117997, Russian Federation,
V. Yu. Alekseev, Postgraduate Student, va413@mail.ru,
Voronezh State University, Voronezh, 394018, Russian Federation
Corresponding author: Yaroslav A. Turovsky, Leading Researcher, V. A. Trapeznikov Institute of Control Sciences of Russian Academy of Sciences, Moscow, 117997, Russian Federation. E-mail: yaroslav_turovsk@mail.ru
Received on January 09, 2023
Accepted on January 31, 2023
The results of a study of two systems for monitoring operator attention are presented: a wearable oculointerface that uses infrared sensors to estimate the deviation of the operator's pupil, and a remote video oculography system that predicts the direction of the operator's gaze from the image of a stationary monocular video camera. The experiments showed that the wearable oculointerface had a lower average recognition error than the remote oculography interface; however, because of its design, it is less convenient for the end user.
Keywords: oculography, video oculography, oculointerface, ergatic systems, distribution of attention, monitoring of the area of attention, digital monitoring, human-computer interface
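As an illustration of the remote video oculography approach summarized in the abstract, the following minimal sketch estimates a coarse horizontal gaze zone from a stationary monocular camera using the OpenCV and MediaPipe Face Mesh libraries listed in the references. It is not the authors' implementation: the landmark indices, the two-eye averaging, and the fixed zone thresholds are illustrative assumptions, and no per-operator calibration is performed.

```python
# Minimal sketch (not the article's implementation): coarse horizontal gaze
# estimation from a single webcam with the MediaPipe Face Mesh solution.
import cv2
import mediapipe as mp

# Eye-corner and iris-center landmark indices from the Face Mesh topology
# (iris centers 468/473 are available when refine_landmarks=True); the
# pairing below is an assumption commonly used with this topology.
RIGHT_EYE_CORNERS = (33, 133)
LEFT_EYE_CORNERS = (362, 263)
RIGHT_IRIS_CENTER, LEFT_IRIS_CENTER = 468, 473

def horizontal_ratio(landmarks, corners, iris_center):
    """Position of the iris center between the two eye corners, roughly in [0, 1]."""
    x0, x1 = landmarks[corners[0]].x, landmarks[corners[1]].x
    xi = landmarks[iris_center].x
    return (xi - x0) / (x1 - x0 + 1e-9)

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1, refine_landmarks=True,
    min_detection_confidence=0.5, min_tracking_confidence=0.5)

cap = cv2.VideoCapture(0)  # stationary monocular camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        ratio = (horizontal_ratio(lm, RIGHT_EYE_CORNERS, RIGHT_IRIS_CENTER) +
                 horizontal_ratio(lm, LEFT_EYE_CORNERS, LEFT_IRIS_CENTER)) / 2
        # Crude zone decision in image coordinates; a real attention-monitoring
        # system would calibrate thresholds per operator and screen layout.
        zone = "left" if ratio < 0.40 else "right" if ratio > 0.60 else "center"
        cv2.putText(frame, f"gaze: {zone} ({ratio:.2f})", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("gaze sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```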
pp. 137—145
For citation:
Turovsky Ya. A., Alekseev V. Yu. Comparison of Approaches to Tracking Operators' Distribution of Attention Using Oculographic Interfaces, Programmnaya Ingeneria, 2023, vol. 14, no. 3, pp. 137—145. DOI: 10.17587/prin.14.137-145 (in Russian).
References:
- Barabanshchikov V. A., Zhegallo A. V. Eye Movement Recording Methods: Theory and Practice, Psihologicheskaya nauka i obrazovanie psyedu.ru, 2010, vol. 2, no. 5, pp. 240—254 (in Russian).
- Blaginin A. A., Sinelnikov S. N., Naturalnikov I. O. et al. Evaluation of the features of the distribution of attention of operators using the method of stationary eye-tracking, Vestnik psihofiziologii, 2019, no. 3, pp. 89—91 (in Russian).
- Arar N. M. Robust Eye Tracking Based on Adaptive Fusion of Multiple Cameras, PhD thesis, École Polytechnique Fédérale de Lausanne, 2017, 168 p. DOI: 10.5075/epfl-thesis-7933.
- Turovskyi Ya. A., Alekseev A. V., Lesnyh I. E., Martinenko E. V. Evaluation of fatigue when using the video-oculographic interface in tasks of controlling ergatic systems, Epistemologicheskie osnovaniya sovremennogo obrazovaniya: aktual'nye voprosy prodvizheniya fundamental'nogo znaniya v uchebnyj process: materialy Mezhdunarodnoj nauchno-prakticheskoj konferencii 2020 Borisoglebskogo filiala FGBOU VO «VGU». Borisoglebsk, 2020, pp. 406—410 (in Russian).
- Turovskyi Ya. A., Alekseev A. V., Lesnyh I. E., Martinenko E. V. Frequency-temporal features of eye movement when using a video-oculographic interface in tasks of controlling ergatic systems, Sensornye sistemy, 2021, vol. 35, no. 1, pp. 30—37. DOI: 10.31857/S0235009221010091 (in Russian).
- Li Zh., Guo P., Song Ch. A review of main eye movement tracking methods, Journal of Physics: Conference Series, 2021, vol. 1802, p. 042066. DOI: 10.1088/1742-6596/1802/4/042066.
- Microsoft .NET Framework, available at: https://dotnet.microsoft.com/ (date of access 09.01.2023).
- Math.NET Numerics, available at: https://numerics.mathdotnet.com/ (date of access 09.01.2023).
- Panda3D, the open-source framework for 3D rendering and games, available at: https://www.panda3d.org/ (date of access 09.01.2023).
- OpenCV computer vision library, available at: https://docs.opencv.org/3.4/da/d60/tutorial_face_main.html (date of access 09.01.2023).
- MediaPipe computer vision library, available at: https://google.github.io/mediapipe/solutions/face_mesh (date of access 09.01.2023).
- Grishchenko I., Ablavatski A., Kartynnik Yu. et al. Attention Mesh: High-fidelity Face Mesh Prediction in Real-time, CVPR Workshop on Computer Vision for Augmented and Virtual Reality, 2020, available at: https://arxiv.org/abs/2006.10962 (date of access 09.01.2023).
- Alter T. D. 3D pose from 3 corresponding points under weak-perspective projection, Cambridge, MA, Massachusetts Institute of Technology, Artificial Intelligence Laboratory, A. I. Memo No. 1378, 1992, 40 p.