IWAR'99 Breakout Session:

Interaction and Collaboration Techniques Using Augmented Reality


Summary: Mark Billinghurst (grof@hitl.washington.edu)

This IWAR workshop discussion was divided into two parts. At the beginning of the workshop, participants discussed collaborative AR systems they were aware of and some of the interaction techniques used in those systems. In the second part of the workshop there was a more focused discussion about research challenges that need to be addressed. This report summarizes both discussions. If the attendees notice any mistakes or omissions, please contact me so they can be corrected.

COLLABORATIVE AR SYSTEMS

The collaborative AR systems that were discussed can be categorized into two types: projective AR, which uses projective displays, and HMD AR, which uses optical or video see-through HMDs.

Collaborative HMD AR Systems

The EMMIE system was developed by Steve Feiner's research group at Columbia University. It supports co-located augmented reality, with users in the same physical location, as well as remote users. Users wear see-through HMDs and sit around a conference table with a projection screen. On the table are a number of different display devices such as flat panel displays, PDAs and computer monitors. Users can move data representations from the display screens to the shared AR space with a tracked stylus and interact with shared AR objects in the space. Unlike other face-to-face AR systems, EMMIE allows environmental control, meaning that the AR interface can be used to transfer files between machines and devices. In the future the group plans to develop additional Java infrastructure that will support other disparate interfaces and enable collaboration between indoor and outdoor users. This would enable collaboration between users of EMMIE and users of Columbia's mobile outdoor AR systems. A paper on EMMIE was presented at IWAR 99. More information can be found at http://www.graphics.cs.columbia.edu/graphics/

The Nara Institute of Science and Technology (NAIST) in Japan has extended its VLEGO virtual building block application to support collaborative co-located augmented reality. Users sit across the table from one another and collaboratively build virtual scenes while being able to see each other and the real world at the same time. Interaction with the system is via a pair of magnetic trackers each user wears on their hands. These enable users to select graphics primitives from virtual menus of objects, apply constraints to those primitives, and manipulate and interact with the objects. A unique aspect of the NAIST system is the ability of users to switch between an AR view of the scene and experiencing it as an immersive virtual environment. Users do this by changing their body scale: when they scale their bodies smaller or larger, the HMD switches from see-through to occluded mode and the user experiences the virtual environment immersively. In a collaborative setting, when one user is immersed and the other is using the AR interface, the AR user sees the immersed user as a virtual avatar. An image of the system in use can be found at http://www.aist-nara.ac.jp/~kiyosi-k/work.html
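
As a rough sketch of this scale-based transition, the following Python illustration (with hypothetical names and thresholds, not NAIST's actual code) derives both the HMD mode and a peer's appearance from the tracked body scale:

    AR_MODE = "see-through"   # HMD shows the real world plus graphics
    VR_MODE = "occluded"      # HMD shows only the virtual environment

    def display_mode(body_scale, lower=0.9, upper=1.1):
        """Return the HMD mode for a user at the given body scale.

        At roughly natural scale the user sees the real world through
        the HMD; once they shrink or grow past the (illustrative)
        thresholds, the view switches to fully immersive VR.
        """
        if lower <= body_scale <= upper:
            return AR_MODE
        return VR_MODE

    def peer_appearance(peer_scale, my_mode):
        """An AR user sees an immersed peer as a virtual avatar."""
        if my_mode == AR_MODE and display_mode(peer_scale) == VR_MODE:
            return "avatar"   # draw an avatar at the peer's tracked pose
        return "real"         # the peer is simply visible through the HMD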

The MR Systems Lab in Yokohama, Japan has developed several systems for face-to-face collaborative AR. AR2 Hockey is an AR version of the classic air hockey table game, while in RV Border Guards players work together to defeat virtual alien invaders. Interaction in RV Border Guards was through tracking one hand of each player and recognizing hand gestures for loading, aiming and firing their virtual guns. In AR2 Hockey each player used a small plastic puck with IR emitters in it; an overhead camera was then used to track the players' hands. In these systems the developers discovered that an outside view for spectators was important in allowing them to share the experience of the players. A poster about RV Border Guards was presented at IWAR 99. More information can be found at http://www.mr-system.co.jp/.
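
Overhead IR tracking of this kind can be approximated with simple bright-blob detection. The following OpenCV sketch is my own illustration of the general technique; the threshold value and the single-blob assumption are assumptions, not details of the MR Systems Lab implementation:

    import cv2

    def track_ir_emitters(frame):
        """Return the image centroid (x, y) of the brightest IR blob, or None.

        Assumes an overhead camera with an IR-pass filter, so the puck's
        emitters dominate the image; the threshold is illustrative.
        Uses the OpenCV 4.x findContours signature.
        """
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)   # largest bright region
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])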

At TU Wien, the Technical University of Vienna, researchers have been working since 1996 on the Studierstube co-located collaborative AR environment. Interaction in Studierstube is primarily through the Personal Interaction Panel (PIP), which uses a pen and pad metaphor. The user holds a pad in one hand and a pen with switches on it in the other. The pad surface provides force feedback for a variety of input devices, including virtual slider bars, selection menus and dials. Studierstube is built on a sophisticated multi-tasking, multi-user architecture that enables users to run multiple applications simultaneously. More information can be found at http://www.icg.tu-graz.ac.at/~Research/Studierstube.
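
As an illustration of the pen and pad metaphor, a virtual slider on such a panel can be implemented by transforming the tracked pen tip into the pad's local coordinate frame. This Python/NumPy sketch is my own simplification; the coordinate conventions and dimensions are assumed, not taken from Studierstube:

    import numpy as np

    def pen_in_pad_frame(pen_pos, pad_pose):
        """Transform a world-space pen tip into the pad's local frame.

        pad_pose is a 4x4 homogeneous matrix from the pad's magnetic
        tracker (pad local frame -> world frame).
        """
        world_to_pad = np.linalg.inv(pad_pose)
        p = world_to_pad @ np.append(pen_pos, 1.0)
        return p[:3]

    def slider_value(pen_pos, pad_pose, x0, x1, contact_eps=0.005):
        """Map the pen tip to a 0..1 slider value while it touches the pad.

        The slider runs from x0 to x1 (metres) along the pad's local x
        axis; contact is assumed when the tip is within contact_eps of
        the pad surface (local z ~ 0). All dimensions are illustrative.
        """
        x, y, z = pen_in_pad_frame(pen_pos, pad_pose)
        if abs(z) > contact_eps:
            return None                      # pen not touching the pad
        t = (x - x0) / (x1 - x0)
        return min(max(t, 0.0), 1.0)         # clamp to the slider range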

At ISWC 99, Nokia presented the MEX architecture and a collaborative AR application built on top of it. This was an outdoor game that used two wearable computers to enable players to engage in an enhanced version of "Capture the Flag". GPS receivers are used to track players as they run around a forest, and a wireless network is used to communicate between the wearables and a more powerful basestation computer. A paper on the MEX architecture was presented at ISWC 99.
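
To make the setup concrete, a minimal sketch of the wearable-to-basestation position updates such a game needs might look like the following. The address, packet format and update rate are all my own assumptions; this is not the actual MEX protocol:

    import json
    import socket
    import time

    BASESTATION = ("192.168.0.1", 9999)   # hypothetical basestation address

    def send_position_updates(player_id, gps_fixes):
        """Stream GPS fixes from a wearable to the basestation over UDP.

        gps_fixes is any iterable yielding (lat, lon) tuples, e.g. parsed
        from a serial NMEA stream on the wearable.
        """
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for lat, lon in gps_fixes:
            packet = json.dumps({"id": player_id, "lat": lat,
                                 "lon": lon, "t": time.time()})
            sock.sendto(packet.encode(), BASESTATION)
            time.sleep(1.0)               # 1 Hz is typical for consumer GPS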

At the Human Interface Technology Laboratory (HIT Lab) at the University of Washington, researchers have been exploring both face-to-face and remote collaborative augmented reality. For example, their Shared Space project uses computer vision based tracking to overlay live virtual video windows of remote users onto physical objects. In their work they have been exploring physically based interaction in collaborative AR settings as well as pen and tablet interaction. This work was presented at ISWC 99, IWAR 99 and CHI 99, and demonstrated at SIGGRAPH 99. More information can be found at http://www.hitl.washington.edu/research/shared_space/
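
The general technique of overlaying a live video window on a tracked physical object can be sketched with a planar homography. This OpenCV example is my own illustration of the approach, not the HIT Lab's actual tracking code, and it assumes a fiducial tracker already supplies the four image-space corners of a physical card:

    import cv2
    import numpy as np

    def overlay_video_window(scene, video_frame, card_corners):
        """Warp video_frame onto the quadrilateral card_corners in scene.

        card_corners: 4x2 float array of the physical card's image-space
        corners (e.g. from a fiducial tracker), ordered to match the
        video frame's corners: top-left, top-right, bottom-right,
        bottom-left.
        """
        h, w = video_frame.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        dst = np.float32(card_corners)
        H = cv2.getPerspectiveTransform(src, dst)
        warped = cv2.warpPerspective(video_frame, H,
                                     (scene.shape[1], scene.shape[0]))
        # Paste the warped video over the card region only.
        mask = np.zeros(scene.shape[:2], dtype=np.uint8)
        cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
        scene[mask > 0] = warped[mask > 0]
        return scene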

Projective AR Systems

Other collaborative AR systems focus on projective AR rather than HMD AR. At the University of North Carolina, researchers are working on the Office of the Future project, in which computer vision techniques and multiple projectors are used to display virtual images on room walls and white objects. In this way users in the room can experience co-located or remote collaborative AR. A paper on some of the projective techniques used was presented at IWAR 99. For more information see http://www.cs.unc.edu/Research/stc/office/index.html

Charles Robertson at Memorial University of Newfoundland, Canada, has developed a system called Live Paper. In this system computer vision techniques are used to recognize ordinary pieces of paper, and virtual images are then projected onto the pages. In a face-to-face setting this type of projective AR can support co-located collaboration. This work was presented at MM 99. See http://www.engr.mun.ca/~charlesr/

Other examples of Projective AR that support collaboration include:


RESEARCH CHALLENGES

The second part of the workshop focused on the research challenges that needed to be addressed.

Participants in the workshop felt that collaborative AR systems are unique because they combine virtual information with physical objects. Traditionally, collaborative systems are categorized into the quadrants of the following space and time matrix:

                    | Local | Remote
    ----------------+-------+--------
    Synchronous     |       |
    ----------------+-------+--------
    Asynchronous    |       |

However, while information exchange can cross between quadrants (such as between local and remote people), it is difficult to use physical objects in the same way. One research challenge that needs to be addressed is how physical objects in a collaborative AR system can be given the same flexibility to support collaboration across space and time that digital information has. New AR interfaces and devices therefore need to be developed to support smooth transitions between these quadrants.

To show how easy this transition should be, one attendee gave the example of a person talking on a cell phone who walks into the same room as the person they are talking to, puts down the cell phone, and seamlessly continues the conversation.

Attendees also felt that there needs to be better documentation of the collaborative process in order to understand where AR interfaces could be applied to enhance it. This understanding could come through ethnographic studies of real work groups, or through creating and studying artificial situations. The outcome should be a set of guidelines for human communication and collaboration that can be used to develop successful collaborative AR interfaces. These guidelines should be similar to the early rules of thumb developed by the Society for Information Display for visual information display (e.g., use no more than seven colors). An important part of developing such rules of thumb is conducting rigorous user studies of collaborative AR systems.

Attendees felt that some issues need to be explored to improve synchronous collaboration, such as using a common information representation and displaying some representation of each user's viewpoint. This should make it easier to recognize what each user in the system is doing.

Another topic of interest was the problem of Shared Manipulation. A collaborative AR system usually involves physical object manipulation with both haptic and visual feedback. The physical object provides a common frame of reference, but there are a number of additional challenges. For example, unlike virtual objects, it is difficult to control the appearance, behavior or lighting of real objects.

One final point brought up was that most of the conversation involved the visual aspects of collaborative AR systems rather than haptic, audio, or other display modalities.