This project is part of Janie's thesis research for a Master of Science in Computer Science and Engineering at The Pennsylvania State University.
The ART system uses a monitor-based display to convey information to the user. The project presents an assembly sequence, real-time video with augmentation, a parts graph, and a descriptive video for each assembly step. All augmentation is done with OpenGL and VL graphics, XView images, and MoviePlayer video with sound. The software is written in C, which should make it relatively straightforward to port to other platforms.
The system uses an IRIX machine with a CAHU model 2222-2000/Z10R camera.
The second window is the Assembly Path Window. The assembly path is shown with each step represented as a node with a brief text description of the assembly step. The user can retrieve information about each step simply by clicking on its node: right-clicking plays a video with sound showing the assembly step, left-clicking displays the step's part list, and middle-clicking invokes the Verifier, which is described below.
The Part Selection Window contains thumbnail pictures of the parts; selecting a thumbnail brings up more detailed information about that part and its purpose, helping the user understand how the various parts are used in the process.
The Video Window is the primary information source for the user. This window displays the real scene as it is captured by the video camera, combined with augmented graphics, sound, and text. The parts for each step are highlighted, and lines are drawn showing the user where the parts are assembled together. An example of the Video Window is shown below.
The example below shows the wheel assembly step. The wheel and base are identified by the Recognizer and Tracker and labeled by the Augmentor according to the current assembly step. The two fiducial codes are connected by a red arrow, indicating that the two parts are related.
The Part Recognizer and Tracker component sends the coordinate locations of the individual parts to the Augmentor, which identifies the parts relevant to the current process step with text labels and draws connecting lines to their proper locations. All augmentation is controlled on a per-step basis so as not to confuse the user with more information than is needed at the moment. The user selects the process step by clicking on a node in the Assembly Path Window.
After the system passes this first check, a visual image of what the product should look like is presented so that the user can visually verify that the step was performed properly. If this check also passes, the ART system considers the step complete. This user intervention is required because, as the process progresses, more and more parts (or portions of parts) become occluded, and because some parts, such as nuts and bolts, are not tracked. Without the user inspection step, the system would judge that the wheel was attached regardless of whether the associated nut was holding it in place.
Participants in the experiment included engineers, graduate students, and non-technical users, each assigned to a training technique. The techniques were: a no-instruction group, used as a measure of how intuitive the assembly is; a paper-manual group; an ART group with training; and an ART group without training, used to gauge how intuitive the tool is. Each person received training on the first trial and then had to assemble the toy truck without assistance for four additional trials, to measure how effective the training method was. Each assembly went through a quality-control process that explained why the assembly passed or failed inspection.
The no-instruction group had the most difficulty with the assembly process. The participants were very frustrated trying to figure out exactly where the various pieces went. No one in this group assembled the truck on the first attempt so that it passed quality control; the average first-attempt assembly time was 12 minutes, 43 seconds. By the fourth attempt there was 100% acceptance by quality control, and the fifth attempt had an average assembly time of 2 minutes, 28 seconds. Based on the difficulty this group had with the assembly, we determined that this is not a very intuitive assembly project.
The paper-manual group showed significant improvement over the no-instruction group. These participants used a manual written from the assembly path used by the ART system, with both text and wire-frame graphics. The first trial for this group averaged 13 minutes, 40 seconds, with a 20% success rate through quality control; the time is slightly high because participants had to read the manual. By the second trial the success rate was 60%, and the fifth attempt had an average assembly time of 2 minutes, 9 seconds.
The ART system groups had not been completed at the time of writing, but it is expected that their initial assembly times will be slightly faster than the other groups', since the assembly process should be easier even though the participants will not be familiar with the system. It is also expected that there will be a 100% success rate through quality control, since the ART system notifies the user of errors at the point where they are introduced.