[1] REN F,BAO Y.A review on human-computer interaction and intelligent robots[J].International Journal of Information Technology & Decision Making,2020,19(1):5-47.
[2] LAZAR J,FENG J H,HOCHHEISER H.Research methods in human-computer interaction[M].Burlington:Morgan Kaufmann,2017.
[7] YU D J,KOU G,XU Z S,et al.Analysis of collaboration evolution in AHP research:1982-2018[J].International Journal of Information Technology & Decision Making,2021,20(1):7-36.
[8] AL-SA'DI A,AL-SAMARRAIE H.A Delphi evaluation of user interface design guidelines:the case of Arabic[J].Advances in Human-Computer Interaction,2022,2022:5492230.doi:10.1155/2022/5492230.
[9] AKYEAMPONG J,UDOKA S,CARUSO G,et al.Evaluation of hydraulic excavator human-machine interface concepts using NASA TLX[J].International Journal of Industrial Ergonomics,2014,44(3):374-382.
[10] KUMAR N,KUMAR J.Measurement of cognitive load in HCI systems using EEG power spectrum:an experimental study[J].Procedia Computer Science,2016,84:70-78.
[11] DIKMEN M,BURNS C.The effects of domain knowledge on trust in explainable AI and task performance:a case of peer-to-peer lending[J].International Journal of Human-Computer Studies,2022,162:102792.
[12] OBERHAUSER M,DREYER D.A virtual reality flight simulator for human factors engineering[J].Cognition,Technology & Work,2017,19:263-277.
[13] PEÑA J,NÁPOLES G,SALGUEIRO Y.Explicit methods for attribute weighting in multi-attribute decision-making:a review study[J].Artificial Intelligence Review,2020,53:3127-3152.
[14] VAIRAMUTHU S,ANOUNCIA S M.Design of near optimal user interface with minimal UI elements using evidence based recommendations and multi criteria decision making:TOPSIS method[J].International Journal of Humanitarian Technology,2018,1(1):40-65.
[15] MANLY B F J,ALBERTO J A N.Multivariate statistical methods:a primer[M].London:Chapman and Hall,2016.
[19] LI W C,ZAKARIJA M,YU C S,et al.Interface design on cabin pressurization system affecting pilots' situation awareness:the comparison between digital displays and pointed displays[J].Human Factors and Ergonomics in Manufacturing & Service Industries,2020,30(2):103-113.
[20] CHEN Y,YAN S Y,TRAN C C.Comprehensive evaluation method for user interface design in nuclear power plant based on mental workload[J].Nuclear Engineering and Technology,2019,51(2):453-462.
[21] GAO L L,LIU Y J,LIU R,et al.Design and implementation of human-machine interaction (HMI) availability assessment system[C]//The 2nd International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI).Shenyang:IEEE,2021:220-230.
[23] CARTER B T,LUKE S G.Best practices in eye tracking research[J].International Journal of Psychophysiology,2020,155:49-62.
[24] VALTAKARI N V,HOOGE I T C,VIKTORSSON C,et al.Eye tracking in human interaction:possibilities and limitations[J].Behavior Research Methods,2021,53(6):1592-1608.
[25] MARTINEZ-MARQUEZ D,PINGALI S,PANUWATWANICH K,et al.Application of eye tracking technology in aviation,maritime,and construction industries:a systematic review[J].Sensors,2021,21(13):4289.
[26] OYEKUNLE R,BELLO O,JUBRIL Q,et al.Usability evaluation using eye-tracking on e-commerce and education domains[J].Journal of Information Technology and Computing,2020,1(1):1-13.
[27] KHOSRAVAN N,CELIK H,TURKBEY B,et al.A collaborative computer aided diagnosis (C-CAD) system with eye-tracking,sparse attentional model,and deep learning[J].Medical Image Analysis,2019,51:101-115.
[32] CHEN S,EPPS J.Using task-induced pupil diameter and blink rate to infer cognitive load[J].Human-Computer Interaction,2014,29(4):390-413.
[34] TIAN C Z,SONG M,TIAN J W,et al.Evaluation of air combat control ability based on eye movement indicators and combination weighting GRA-TOPSIS[J].Aerospace,2023,10(5):437.
[36] POOLE A,BALL L J.Eye tracking in human-computer interaction and usability research:current status and future prospects[M]//GHAOUI C.Encyclopedia of human computer interaction.Hershey:Idea Group,2006:211-219.
[37] JOSEPH A W,MURUGESH R.Potential eye tracking metrics and indicators to measure cognitive load in human-computer interaction research[J].Journal of Scientific Research,2020,64(1):168-175.
[39] HAULAND G.Measuring team situation awareness by means of eye movement data[M]//HARRIS D,DUFFY V,SMITH M,et al.Human-centered computing.Boca Raton:CRC Press,2019:230-234.
[40] HUTTON S B.Eye tracking methodology[M]//KLEIN C,ETTINGER U.Eye movement research.Cham:Springer,2019:277-308.
[41] WEDEL M,PIETERS R.A review of eye-tracking research in marketing[J].Review of Marketing Research,2017:123-147.
[42] CHENNAMMA H R,YUAN X H.A survey on eye-gaze tracking techniques[J].Indian Journal of Computer Science and Engineering (IJCSE),2013,4(5):388-393.
[43] MONTY R A,SENDERS J W.Eye movements and psychological processes[M].London:Routledge,2017.
[44] KLAIB A F,ALSREHIN N O,MELHEM W Y,et al.Eye tracking algorithms,techniques,tools,and applications with an emphasis on machine learning and internet of things technologies[J].Expert Systems with Applications,2021,166:114037.
[45] MARTINIKORENA I,CABEZA R,VILLANUEVA A,et al.Fast and robust ellipse detection algorithm for head-mounted eye tracking systems[J].Machine Vision and Applications,2018,29:845-860.
[47] LIPP B,DICKEL S.Interfacing the human/machine[J].Distinktion:Journal of Social Theory,2023,24(3):425-443.
[48] SHNEIDERMAN B,PLAISANT C,COHEN M S,et al.Designing the user interface:strategies for effective human-computer interaction[M].Boston:Addison-Wesley Longman Publishing Co.,Inc.,2016.
[49] LODGAARD E,DRANSFELD S.Organizational aspects for successful integration of human-machine interaction in the Industry 4.0 era[J].Procedia CIRP,2020,88:218-222.
[50] DIX A.Human-computer interaction,foundations and new paradigms[J].Journal of Visual Languages & Computing,2017,42:122-134.
[54] WANG Y Y,LIU Q F,XIONG D Q,et al.Research on assessment of eye movement sensitivity index through aircraft cockpit man-machine interface based on eye movement tracking technology[C]//Proceedings of the 15th International Conference on Man-Machine-Environment System Engineering.Berlin:Springer,2015:495-502.
[55] ZHANG L,ZHUANG D M,WANYAN X R.Information coding for cockpit human-machine interface[J].Chinese Journal of Mechanical Engineering,2011,24(4):707-712.
[60] ZHANG M M,HOU G H,CHEN Y C.Effects of interface layout design on mobile learning efficiency:a comparison of interface layouts for mobile learning platform[J].Library Hi Tech,2023,41(5):1420-1435.
[61] YAN S Y,WEI Y Y,TRAN C C.Evaluation and prediction mental workload in user interface of maritime operations using eye response[J].International Journal of Industrial Ergonomics,2019,71:117-127.
[62] GODFROID A,HUI B.Five common pitfalls in eye-tracking research[J].Second Language Research,2020,36(3):277-305.
[63] MAJARANTA P,BULLING A.Eye tracking and eye-based human-computer interaction[M]//FAIRCLOUGH S H,GILLEADE K.Advances in physiological computing.Berlin:Springer,2014:39-65.
[64] SCOTT N,ZHANG R,LE D,et al.A review of eye-tracking research in tourism[J].Current Issues in Tourism,2019,22(10):1244-1261.
[65] PARISAY M,POULLIS C,KERSTEN M.EyeTAP:a novel technique using voice inputs to address the Midas touch problem for gaze-based interactions[R].Los Alamos:arXiv Preprint,2020:arXiv:2002.08455.
[66] PARISAY M,POULLIS C,KERSTEN-OERTEL M.EyeTAP:introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques[J].International Journal of Human-Computer Studies,2021,154:102676.
[70] CHENG S W,WEI Q J.Design preferred aesthetic user interface with eye movement and electroencephalography data[C]//Proceedings of the 2018 ACM Companion International Conference on Interactive Surfaces and Spaces.New York:Association for Computing Machinery,2018:39-45.
[71] ZHOU C,YUAN F F,HUANG T,et al.The impact of interface design element features on task performance in older adults:evidence from eye-tracking and EEG signals[J].International Journal of Environmental Research and Public Health,2022,19(15):9251.
[73] BALCHI S,MIRARAB A.A new view at usability test methods of interfaces for human computer interaction[J].Global Journal of Computer Science and Technology, 2015,15:16-24.
[74] ISSA T,ISAIAS P.Usability and human-computer interaction(HCI)[M]//ISSA T,ISAIAS P.Sustainable design:HCI,usability and environmental concerns.London:Springer,2022.
[75] COHEN M X.Where does EEG come from and what does it mean?[J].Trends in Neurosciences,2017,40(4):208-218.
[76] CHARLES R L,NIXON J.Measuring mental workload using physiological measures:a systematic review[J].Applied Ergonomics,2019,74:221-232.
[77] KRAMER A F.Physiological metrics of mental workload:a review of recent progress[R].San Diego:Navy Personnel Research and Development Center,1990.
[78] MEISSNER M,PFEIFFER J,PFEIFFER T,et al.Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research[J].Journal of Business Research,2019,100:445-458.
[79] RAZHEVA A V,ROZALIEV V L,ORLOVA Y A.Modern eye tracking research and technology[J].Information Innovative Technologies,2018,15(1):229-235.
[80] CLAY V,KÖNIG P,KOENIG S.Eye tracking in virtual reality[J].Journal of Eye Movement Research,2019,12(1):PMC7903250.
[81] RADACH R,KENNEDY A.Eye movements in reading:some theoretical context[J].Quarterly Journal of Experimental Psychology,2013,66(3):429-452.
[82] HORSLEY M,ELIOT M,KNIGHT B A,et al.Current trends in eye tracking research[M].Berlin:Springer,2013.
[83] GEGENFURTNER K R.The interaction between vision and eye movements[J].Perception,2016,45(12):1333-1357.
[85] RIGAS I,KOMOGORTSEV O,SHADMEHR R.Biometric recognition via eye movements:saccadic vigor and acceleration cues[J].ACM Transactions on Applied Perception (TAP),2016,13(2):1-21.
[86] DE KLOE Y J R,HOOGE I T C,KEMNER C,et al.Replacing eye trackers in ongoing studies:a comparison of eye-tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum[J].Infancy,2022,27(1):25-45.
[88] RAKHMATULIN I,DUCHOWSKI A T.Deep neural networks for low-cost eye tracking[J].Procedia Computer Science,2020,176:685-694.
[90] KIM M,KIM B H,JO S.Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement[J].IEEE Transactions on Neural Systems and Rehabilitation Engineering,2014,23(2):159-168.
[91] GIBALDI A,VANEGAS M,BEX P J,et al.Evaluation of the Tobii EyeX eye tracking controller and Matlab toolkit for research[J].Behavior Research Methods,2017,49:923-946.
[92] EHINGER B V,GROSS K,IBS I,et al.A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000[J].PeerJ,2019,7:e7086.
[93] HOOGE I T C,HESSELS R S,NYSTRÖM M.Do pupil-based binocular video eye trackers reliably measure vergence?[J].Vision Research,2019,156:1-9.
[94] NIEHORSTER D C,SANTINI T,HESSELS R S,et al.The impact of slippage on the data quality of head-worn eye trackers[J].Behavior Research Methods,2020,52:1140-1160.
[95] TSAI Y S,CHEN N C,HSIEH Y Z,et al.The development of long-distance viewing direction analysis and recognition of observed objects using head image and deep learning[J].Mathematics,2021,9(16):1880.
[98] FORRESTER J V,DICK A D,MCMENAMIN P,et al.The eye[M].Amsterdam:Elsevier,2021.
[99] DE WINTER J C F,EISMA Y B,CABRALL C D D,et al.Situation awareness based on eye movements in relation to the task environment[J].Cognition,Technology & Work,2019,21(1):99-111.