Poster, presented at IWAR'99, San Francisco, October 20-21, 1999

Experimental Evaluation of Augmented Reality for Assembly Training

Janie Kustaborder and Rajeev Sharma


http://www.cse.psu.edu

Abstract
Introduction
ART Project Description
Experiment and Its Outcomes
References


Abstract

Several researchers have applied Augmented Reality to manufacturing assembly, for example the Boeing wire bundling system [1] and the BMW door lock assembly [2]. This work analyzes the effectiveness of using Augmented Reality as a training technique for an assembly application, rather than incorporating Augmented Reality directly onto the factory floor. It describes experiments that compare the performance of training with Augmented Reality against other methods of instruction.

This project is part of Janie Kustaborder's thesis research for a Master of Science in Computer Science and Engineering at The Pennsylvania State University.


Introduction

The basic goal of training, regardless of form, is to teach a person a skill or procedure. Several techniques exist, including human trainers, paper instruction manuals with text and graphics, instructional video, and auditory instruction [3]. This research proposes using Augmented Reality to accomplish assembly training more efficiently.


ART Project Description

This section describes the Augmented Reality for Training (ART) project, a computer vision system that uses Augmented Reality to aid in assembly training. The system provides animation to aid in the assembly of a toy dump truck, but can easily be extended to other products simply by changing parameter files. During the assembly process the system should be able to detect both successfully completed steps and steps that contain errors.
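The parameter file format is not described in this write-up, so the following is only a hypothetical sketch of what such a file might contain for the dump truck: a list of parts with their fiducial codes and media files, plus an ordered list of assembly steps. All field names and file names below are assumptions.

    # Hypothetical ART parameter file (format and field names are assumed)
    part  base   fiducial=01  image=base.rgb   video=base.mv
    part  wheel  fiducial=07  image=wheel.rgb  video=wheel.mv
    step  1  "Attach wheel to base"  parts=base,wheel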

The ART system uses a monitor-based display to convey information to the user. The system presents an assembly sequence, real-time video with augmentation, a parts graph, and a descriptive video for each assembly step. All augmentation is done using OpenGL and VL graphics, Xview images, and MoviePlayer video with sound. The software is written in the C language, which should make it relatively simple to port the program to other platforms.

The system runs on an IRIX workstation with a CAHU model 2222-2000/Z10R camera.

User Interface

The main goal of the user interface segment of the ART system is to communicate information about the assembly process and the parts to the user. The interface has been kept as simple as possible to eliminate confusion in operating ART. Four windows provide information to the user. The first is the Instruction Window.
The second window is the Assembly Path Window. The assembly path is shown with each step represented as a node carrying a brief text description of that step. The user can retrieve information about each step simply by clicking the mouse on its node. Right-clicking a node displays a video with sound showing the assembly step, and left-clicking a node displays the step's part list. Middle-clicking a node invokes the Verifier, which is described below.
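The actual event handling is not shown in this write-up; as an illustration only, a three-button dispatch for a node click might look like the following C sketch, where plain Xlib button codes are used and show_part_list(), run_verifier(), and show_step_video() are hypothetical placeholders rather than the real ART routines.

    /* Hypothetical sketch of dispatching mouse clicks on an Assembly Path node. */
    #include <X11/Xlib.h>

    extern void show_part_list(int step);   /* left click: step's part list  */
    extern void show_step_video(int step);  /* right click: video with sound */
    extern void run_verifier(int step);     /* middle click: invoke Verifier */

    void handle_node_click(const XButtonEvent *ev, int step)
    {
        switch (ev->button) {
        case Button1: show_part_list(step);  break;  /* left   */
        case Button2: run_verifier(step);    break;  /* middle */
        case Button3: show_step_video(step); break;  /* right  */
        }
    }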

The Part Selection Window contains thumbnail pictures of the parts; selecting a thumbnail provides more detailed information about the part, its purpose, and how it is used in the process.

The Video Window is the primary information source for the user. This window displays the real scene as it is captured by the video camera, combined with the augmented graphics, sound, and text. The parts for each step are highlighted, and lines are drawn showing the user where the parts are assembled together. An example of the Video Window is shown below the following section.

Part Recognizer and Tracker

The Part Recognizer and Tracker component of ART has two functions. First, it provides the real scene to the ART system. Second, it tracks multiple parts in real time. All parts are tracked using fiducials, special codes used to uniquely mark each part. Each fiducial is an asymmetric dot code with a major and minor axis. An advantage of using fiducial codes is that pose estimation is simpler, since the codes are in known locations. Fiducial markers also provide a virtually unlimited supply of codes, an advantage if the system is used for a very complex process. Because of the way the markers are positioned along the major and minor axes, the correct code can be detected even with some occlusion. This is very helpful, since there will be times when the codes are not entirely visible while being manipulated by the user. This component provides the Augmentor with the coordinates of each part required for the various assembly steps.
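The tracker's data structures are not given in this write-up; as an assumption-laden sketch, the per-part information handed to the Augmentor each frame might look something like the following, with the structure and function names invented for illustration.

    /* Illustrative only: per-part tracking result reported to the Augmentor. */
    typedef struct {
        int   code;         /* unique fiducial dot code identifying the part */
        float x, y;         /* image coordinates of the fiducial centre      */
        float orientation;  /* angle of the major axis, in radians           */
        int   visible;      /* nonzero if enough dots were seen this frame   */
    } TrackedPart;

    /* For each captured video frame, detect the (possibly partially occluded)
       dot codes and report the pose of every part needed by the current step. */
    void track_frame(const unsigned char *frame, int width, int height,
                     TrackedPart *parts, int nparts);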

The example below shows the wheel assembly step. The wheel and base are identified by the Recognizer and Tracker, and labeled appropriately by the Augmentor based on the current assembly step. The two fiducial codes are connected by a red arrow, indicating that the two parts are related.

Augmentor

The ART system uses both visual and audio augmentation to aid in the training. Photorealistic images of each part are used within the Part Selection Window to allow the user to see exactly what the part looks like at any point during the process. Each step in the Assembly Path Window is associated with a video sequence showing exactly how to complete the step. Also, as each step goes through the verification process, the associated step node changes color and an audio tone is emitted, indicating either that the step was performed successfully or that an error has occurred.

The Part Recognizer and Tracker component sends the coordinate locations of the individual parts to the Augmentor, which then identifies the parts involved in the current process step with text labels and draws connecting lines to their proper locations. All augmentation is controlled on a per-step basis so as not to confuse the user with more information than is needed at the moment. The user selects the process step by clicking on a node in the Assembly Path Window.
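Since the augmentation amounts to drawing labels and connecting lines at the tracked part locations with OpenGL, a minimal sketch might look like the following, reusing the hypothetical TrackedPart structure above; draw_label() stands in for whatever text-drawing routine the system actually uses.

    /* Sketch only: draw a red connecting line between two tracked parts and
       label each one for the current step.  draw_label() is a placeholder. */
    #include <GL/gl.h>

    extern void draw_label(float x, float y, const char *text);

    void augment_step(const TrackedPart *a, const TrackedPart *b,
                      const char *label_a, const char *label_b)
    {
        glColor3f(1.0f, 0.0f, 0.0f);          /* red, as in the wheel example */
        glBegin(GL_LINES);
        glVertex2f(a->x, a->y);
        glVertex2f(b->x, b->y);
        glEnd();

        draw_label(a->x, a->y, label_a);      /* e.g. "wheel" */
        draw_label(b->x, b->y, label_b);      /* e.g. "base"  */
    }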

Verifier

The Verifier component is responsible for verifying that each process step is completed properly and in order. The user initiates the Verifier by clicking the middle mouse button on a node in the Assembly Path Window. The Verifier first uses traditional computer vision techniques to perform any automatic verification, such as checking that the colors of the product are in the correct positions or that the visible fiducial markers form the expected pattern.

After this first check passes, a visual image of what the product should look like is presented to the user so that the user can visually confirm that the step was performed properly. If this check also passes, the step is considered complete by the ART system. User intervention is required because, as the process progresses, more and more parts or portions of parts are occluded, and because some parts, such as nuts and bolts, are not tracked. If the user inspection step were not included, the system would judge that a wheel was attached regardless of whether the associated nut was holding it in place.
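The two-stage check described above can be summarized in a short sketch; check_colors_and_pattern() and ask_user_to_confirm() are hypothetical names for the automatic vision check and the on-screen user confirmation.

    /* Sketch of the Verifier's two-stage decision for a single step. */
    extern int check_colors_and_pattern(int step);  /* 1 if the scene matches */
    extern int ask_user_to_confirm(int step);       /* 1 if the user approves */

    int verify_step(int step)
    {
        if (!check_colors_and_pattern(step))
            return 0;               /* automatic computer vision check failed */
        if (!ask_user_to_confirm(step))
            return 0;               /* user rejected the presented image      */
        return 1;                   /* step is considered complete            */
    }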


Experiment and Its Outcomes

A verification experiment uses timed trials to determine how training with an Augmented Reality system affects productivity compared with alternative training techniques. The assembly chosen for this experiment is a Take-Apart Dump Truck, produced by Wah-Hing Toys, Ltd. The toy truck has over 30 pieces, and assembly requires the use of both a screwdriver and a hex nut driver, included with the toy. The recommended age for the truck is three years and older. Both the parts and the completed truck are shown below.

The experiment participants included engineers, graduate students, and non-technical users, each of whom was assigned a training technique. The four groups were: a no-instruction group, used as a measure of how intuitive the assembly is; a paper manual group; an ART group with training on the system; and an ART group without training, to see how intuitive the tool is. Each person received their training on the first trial and then had to assemble the toy truck without assistance for four additional trials, to see how effective the training method was. Each assembly went through a quality control process that explained why the assembly passed or failed inspection.

The no-instruction group had the most difficulty with the assembly process. The participants were very frustrated trying to figure out exactly where the various pieces went. No one in this group was able to assemble the truck on the first attempt so that it passed quality control; the average assembly time was 12 minutes, 43 seconds. By the fourth attempt, there was 100% acceptance by quality control, and the fifth attempt had an average assembly time of 2 minutes, 28 seconds. Based on the difficulty this group had with the assembly, we determined that this is not a very intuitive assembly project.

The paper manual group showed significant improvement over the no-instruction group. These participants used a manual written from the same assembly path used by the ART system, with both text and wire-frame graphics. The first assembly trial for this group had an average time of 13 minutes, 40 seconds, with a 20% success rate through quality control. The assembly time is slightly high for this group because of the time taken to read the manual. By the second trial there was a 60% success rate, and the fifth attempt had an average assembly time of 2 minutes, 9 seconds.

The ART system groups have not yet completed their trials. It is expected that their initial assembly times will be slightly faster than those of the other groups, since the assembly process should be easier even though the participants will not be familiar with the system. It is also expected that there will be a 100% success rate through quality control, since the ART system notifies the user of any errors at the point where they are introduced.


References

[1] Dan Curtis, David Mizell, Peter Gruenbaum, and Adam Janin. Several Devils in the Detail: Making an AR App Work in the Airplane Factory. IWAR '98. November 1998.

[2] D. Reiners, D. Stricker, G. Klinker, and S. Muller. Augmented Reality for Construction Tasks: Doorlock Assembly. IWAR '98. November 1998.

[3] Eugene A. Crepeau. Production Activity Control. American Production and Inventory Control Society. 1991.

[4] Ronald T. Azuma. A Survey of Augmented Reality. Presence: Teleoperators & Virtual Environments. 1997.

[5] Rajeev Sharma and Jose Molineros. Computer Vision-Based Augmented Reality for Guiding Manual Assembly. Presence: Teleoperators & Virtual Environments, Volume 6, Number 3. June 1997.

[6] Jose Molineros and Rajeev Sharma. A Minimum Spanning Tree Based Marker Coding Scheme for Visual Tracking and Recognition.

[7] James F. Cox III, John H. Blackstone, and Michael S. Spencer. APICS Dictionary, Eighth Edition. American Production and Inventory Control Society. 1995.

[8] Y. Cho, J. Lee, and U. Neumann. A Multi-Ring Color Fiducial System and a Rule-Based Detection Method for Scalable Fiducial-Tracking Augmented Reality. IWAR '98. November 1998.

[9] J. Molineros, V. Raghavan, and R. Sharma. AREAS: Augmented Reality for Evaluating Assembly Sequences. IWAR '98. November 1998.

[10] Ramesh Jain, Rangachar Kasturi, and Brian Schunck. Machine Vision. McGraw-Hill Inc. New York, NY. 1995.



This material is posted here with permission of the IEEE. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by sending a blank email message to info.pub.permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.