Laboratory of Advanced Information Networks
AR Avatars: Eye Contact and Conversations
Real-time video communication has become increasingly common with the growth of the Internet and smartphones. It is now even possible to project a virtual self (an avatar) into augmented reality (AR), controlling the avatar from one's physical location so that one appears to be present at a remote site. In conversation, looking at the other person is an essential part of communication: eye contact and gestures can carry more weight than words, and this remains true when attending group meetings remotely over a network.

Previous avatar-based systems for remote communication were designed around one-on-one conversations and struggle in group settings with multiple participants. In such cases, the system can fail to identify participants individually and to control each avatar's movements in a physically convincing manner.

Our laboratory is developing an AR group communication system that resolves these issues. It shares each participant's location and pose between devices, allowing the avatars to interact convincingly. The avatars know where every participant in the AR group meeting is located, and each participant sees the poses and facial expressions of all the others from their own perspective. In experiments, this produces feelings similar to real conversation: users feel affinity toward avatars that start conversations with them, and alienation from avatars that turn their backs to speak with others. By visually sharing this information, the system improves the realism of avatar-based communication and greatly improves the quality of group conversation.
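As a rough illustration of the kind of state such a system needs to share, here is a minimal sketch in Python. All names and fields are hypothetical, not the laboratory's actual implementation: each device broadcasts its participant's position and facing direction in a shared AR coordinate frame, and a simple cone test decides whether one avatar appears to be facing another, the sort of check a renderer could use to convey eye contact or a turned back.

```python
import math
from dataclasses import dataclass

@dataclass
class ParticipantState:
    """State one device might broadcast to the others (hypothetical schema)."""
    name: str
    x: float        # position in a shared AR coordinate frame (metres)
    y: float
    yaw_deg: float  # facing direction: 0 = +x axis, counter-clockwise

def is_facing(a: ParticipantState, b: ParticipantState,
              fov_deg: float = 30.0) -> bool:
    """True if a's facing direction points at b within a cone of fov_deg
    degrees -- a crude stand-in for an eye-contact test."""
    bearing = math.degrees(math.atan2(b.y - a.y, b.x - a.x))
    # Signed angular difference, normalised to (-180, 180].
    diff = (bearing - a.yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

alice = ParticipantState("alice", 0.0, 0.0, 0.0)    # at the origin, facing +x
bob = ParticipantState("bob", 2.0, 0.0, 180.0)      # 2 m away, facing back toward alice
carol = ParticipantState("carol", 0.0, 2.0, 90.0)   # to the side, facing away

is_facing(alice, bob)    # True  -> mutual eye contact with bob
is_facing(alice, carol)  # False -> carol reads as "turned away"
```

In a real system each `ParticipantState` would carry full 3D pose and expression data and be synchronised continuously over the network; the point of the sketch is only that once every device holds every participant's state, each can render the group consistently from its own viewpoint.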