BiReality uses a teleoperated robotic surrogate to visit remote locations as a substitute for physical travel. Our goal is to recreate, to the greatest extent practical and for both the user and the people at the remote location, the sensory experience of the user actually being there that is relevant to business interactions. Our second-generation system provides an immersive 360-degree surround audio and visual experience for both the user and the remote participants, and streams eight 720x480 MPEG-2 videos totaling almost 20 Mb/s over 802.11a wireless networking. The system preserves gaze and eye contact, presents local and remote participants to each other at life size, and preserves the user's head height at the remote location. This talk focuses on some of the system challenges inherent in the project and includes a short video demonstration.
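As a quick sanity check of those figures (not part of the abstract itself): eight streams totaling almost 20 Mb/s implies about 2.5 Mb/s per stream. Assuming 30 fps video and 4:2:0 chroma subsampling at 12 bits per pixel, both of which are assumptions here rather than stated parameters of the system, that works out to roughly 50:1 MPEG-2 compression per stream:

```python
# Back-of-the-envelope check of the abstract's bandwidth figures.
# Assumed, not stated in the abstract: 30 fps and 4:2:0 (12 bits/pixel) sampling.
NUM_STREAMS = 8      # eight 720x480 MPEG-2 video streams
TOTAL_MBPS = 20.0    # aggregate bitrate quoted in the abstract

per_stream_mbps = TOTAL_MBPS / NUM_STREAMS

# Raw (uncompressed) bitrate of one 720x480 stream under the assumptions above.
raw_mbps = 720 * 480 * 30 * 12 / 1e6

compression_ratio = raw_mbps / per_stream_mbps
print(f"per-stream: {per_stream_mbps:.2f} Mb/s")
print(f"raw 4:2:0:  {raw_mbps:.1f} Mb/s  (~{compression_ratio:.0f}:1 compression)")
```

This also shows why compression is essential: the raw aggregate (eight streams at ~124 Mb/s each) would far exceed what an 802.11a link can carry.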
Norman P. Jouppi is currently a Fellow at HP Labs in Palo Alto, California. He received his PhD in Electrical Engineering from Stanford University and joined Digital Equipment Corporation's Western Research Lab in 1984. From 1984 through 1996 he was also a consulting assistant/associate professor in the Department of Electrical Engineering at Stanford University. He was the principal architect of four microprocessors and contributed to the design of several graphics accelerators. His current research interests include audio, video, and physical telepresence as well as computer systems architecture.