EFG-Net: A Unified Framework for Estimating Eye Gaze and Face Gaze Simultaneously
2022
Proceedings title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN: 0302-9743
Volume: 13534 LNCS
Pages: 552-565
Publication status: Published
DOI: 10.1007/978-3-031-18907-4_43
Abstract

Gaze is of vital importance for understanding human purpose and intention. Recent works have made tremendous progress in appearance-based gaze estimation. However, these works treat eye gaze estimation and face gaze estimation separately, ignoring the mutual benefit arising from the fact that eye gaze and face gaze are roughly the same, differing only slightly in their starting point. For the first time, we propose an Eye gaze and Face Gaze Network (EFG-Net), which lets eye gaze estimation and face gaze estimation take advantage of each other, leading to a win-win situation. Our EFG-Net consists of three feature extractors, a feature communication module named GazeMixer, and three predicting heads. The GazeMixer is designed to propagate coarse gaze features from face gaze to eye gaze and fine gaze features from eye gaze to face gaze. The predicting heads estimate gaze from the corresponding features more finely and stably. Experiments show that our method achieves state-of-the-art performance: 3.90° eye gaze error (a ∼4% improvement) and 3.93° face gaze error (a ∼2% improvement) on the MPIIFaceGaze dataset, and 3.03° eye gaze error and 3.17° face gaze error (a ∼5% improvement) on the GazeCapture dataset. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
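The abstract describes a three-branch layout: separate feature extractors for the eyes and the face, a GazeMixer that passes coarse cues from the face branch to the eye branches and fine cues back, and three predicting heads. The following NumPy sketch illustrates only that information flow; the random linear maps, feature dimension, and two-angle (yaw, pitch) output are assumptions standing in for the paper's actual modules.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # assumed feature dimension (illustrative only)

# Stand-in outputs of the three feature extractors (batch of 2 samples).
feat_eye_l = rng.standard_normal((2, dim))
feat_eye_r = rng.standard_normal((2, dim))
feat_face = rng.standard_normal((2, dim))

# GazeMixer stand-in: face -> eyes carries a coarse gaze cue,
# eyes -> face carries a fine gaze cue (random projections here).
W_face_to_eye = rng.standard_normal((dim, dim)) * 0.1
W_eye_to_face = rng.standard_normal((dim, dim)) * 0.1

coarse_cue = feat_face @ W_face_to_eye
fine_cue = ((feat_eye_l + feat_eye_r) / 2) @ W_eye_to_face

mixed_eye_l = feat_eye_l + coarse_cue
mixed_eye_r = feat_eye_r + coarse_cue
mixed_face = feat_face + fine_cue

# Three predicting heads, each regressing a 2-D gaze direction.
W_head = rng.standard_normal((dim, 2)) * 0.1
gaze_eye_l = mixed_eye_l @ W_head
gaze_eye_r = mixed_eye_r @ W_head
gaze_face = mixed_face @ W_head

print(gaze_eye_l.shape, gaze_eye_r.shape, gaze_face.shape)
```

The point of the cross-branch addition is that each head predicts from features enriched by the other granularity, which is the "mutual benefit" the abstract claims; the real GazeMixer is a learned module, not fixed projections.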

Keywords: Appearance based; Communication modules; Eye-gaze; Face gaze; Feature communication; Feature extractor; Gaze estimation; Mutual benefit; Unified framework; Win-win
Conference: 5th Chinese Conference on Pattern Recognition and Computer Vision, PRCV 2022
Place of publication: Gewerbestrasse 11, Cham, CH-6330, Switzerland
Conference location: Shenzhen, China
Conference dates: November 4-7, 2022
Indexed by: EI; CPCI-S
Language: English
Funding: National Science and Technology Major Project from the Ministry of Science and Technology, China [2018AAA0103100]; National Natural Science Foundation of China [61873255]; Shanghai Municipal Science and Technology Major Project (Zhangjiang Lab) [2018SHZDZX01]; Youth Innovation Promotion Association, Chinese Academy of Sciences [2021233]
WOS research area: Computer Science
WOS categories: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods
WOS accession number: WOS:001308367600043
Publisher: Springer Science and Business Media Deutschland GmbH
EI accession number: 20224813184447
EI controlled terms: Errors
EISSN: 1611-3349
Original document type: Conference article (CA)
Document type: Conference paper
Record identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/251941
Collection: School of Information Science and Technology
School of Information Science and Technology_Distinguished Professor Group_Zhang Xiaolin Group
Corresponding author: Li, Jiamao
Author affiliations:
1. Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai 200050, China;
2. University of Chinese Academy of Sciences, Beijing 100049, China;
3. Xiongan Institute of Innovation, Xiongan 071700, China;
4. University of Science and Technology of China, Hefei, Anhui 230027, China;
5. School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China
Recommended citation:
GB/T 7714
Che, Hekuangyi,Zhu, Dongchen,Lin, Minjing,et al. EFG-Net: A Unified Framework for Estimating Eye Gaze and Face Gaze Simultaneously[C]. GEWERBESTRASSE 11, CHAM, CH-6330, SWITZERLAND:Springer Science and Business Media Deutschland GmbH,2022:552-565.
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.