The paper presents a detailed analysis of modern techniques for tracking gaze with a webcam. Various deep neural network models that can be involved in online gaze monitoring are reviewed, and practical implementations of the most popular gaze-tracking methods are presented. We introduce a new eye-tracking approach that significantly increases the effectiveness of the deep learning method. Specifically, a dual coordinate system is used for controlling the computer by gaze: the first set of coordinates, the position of the face relative to the computer, is obtained by detecting the light of an infrared LED via the OpenCV library; the second set of coordinates, the gaze position, is obtained via the YOLO (v3) package. The implementation is in Python, and its application is demonstrated by controlling interaction with the computer.