Embedding 3-D Gaze Points on a 3-D Visual Field: A Case of Transparency

Authors

  • Fatima Isiaka Department of Computer Science, Nasarawa State University, Keffi, Nigeria
  • Zainab Adamu Department of Computer Science, Ahmadu Bello University, Zaria, Nigeria
  • Muhammad A. Adamu Department of Electrical Engineering, Federal University of Technology, Minna, Nigeria

DOI:

https://doi.org/10.30564/jcsr.v4i1.4037

Abstract

This paper demonstrates the feasibility of embedding a 3D gaze point on a 3D visual field. The visual field takes the form of a game console in which the user plays from one level to the next by overcoming obstacles. A complex game interface can make it difficult for players to progress, and developers likewise find it difficult to tune the game for an average player. The model serves as an analytical tool for game adaptation and also lets players track their own responses to the game. Custom eye-tracking and 3D object-tracking algorithms were developed to support the analysis. This work contributes to user interface design in the area of visual transparency. The development and testing of human-computer interaction applications is now more easily investigated than ever, and embedding a 3D gaze point on a 3D visual field is part of that contribution. The approach could serve a number of applications, for instance medical diagnosis and treatment of long- and short-sightedness. Experiments were conducted on five different episodes of user attributes; results show that fixation points and pupil changes are the two user attributes that contribute most significantly to the performance of the custom eye-tracking algorithm. As the development of eye-movement algorithms advances, the attributes that appeared least often are likely to prove redundant.
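The paper's custom algorithms are not reproduced here, but the core idea of embedding a 2D gaze sample into a 3D scene can be illustrated with a standard pinhole back-projection. The function below is a minimal sketch, not the authors' method: it assumes a gaze point in pixel coordinates, a depth estimate for the fixated object, and hypothetical camera intrinsics (`fx`, `fy`, `cx`, `cy`).

```python
def unproject_gaze(u, v, depth, fx, fy, cx, cy):
    """Map a 2D gaze point (u, v) in pixels, plus a depth estimate,
    to a 3D point in camera coordinates under a pinhole camera model."""
    # Shift to the principal point, then scale by depth over focal length.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A gaze sample at the image centre lands on the optical axis:
print(unproject_gaze(320, 240, 2.0, fx=600, fy=600, cx=320, cy=240))
# (0.0, 0.0, 2.0)
```

In a game-console setting like the one described, `depth` would typically come from the renderer's depth buffer at the gaze pixel, so each fixation can be placed directly into the 3D visual field.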

Keywords:

User behaviour; 3D gaze point; Eye movement; 3D visual interface; 3D game console; User experience


How to Cite

Isiaka, F., Adamu, Z., & Adamu, M. A. (2022). Embedding 3-D Gaze Points on a 3-D Visual Field: A Case of Transparency. Journal of Computer Science Research, 4(1), 1–9. https://doi.org/10.30564/jcsr.v4i1.4037

Article Type

Article