World Transactions on Engineering and Technology Education, © 2002 UICEE, Vol.1, No.2, 2002

Multimedia Augmented Reality Interface for E-learning (MARIE)

Fotis Liarokapis, Panos Petridis, Paul F. Lister & Martin White
University of Sussex
Brighton, England, United Kingdom

ABSTRACT: An interactive Multimedia Augmented Reality Interface for E-Learning (MARIE) is presented in this article. Its application to engineering education is discussed as a means of enhancing traditional teaching and learning methods; however, it is equally applicable to other areas. The authors have developed and implemented a user-friendly interface to experimentally explore the potential of augmented reality by superimposing Virtual Multimedia Content (VMC) information in an Augmented Reality (AR) tabletop environment, such as a student desk workspace. The user can interact with the VMC, which is composed of three-dimensional objects, images, animations, text (ASCII or three-dimensional) and sound. To prove the feasibility of the system, only a small part of the teaching material was digitised, and some experimental results are presented in the article.

INTRODUCTION

In the last ten years, the field of Augmented Reality (AR) has been recognised as one of the most promising areas of computer graphics. During this time, a variety of innovative applications have been developed, stressing the importance of augmented reality in everyday life. However, only a small proportion of the applications reported in the past have been based on real-time performance. Largely, this deficiency derives from registration problems and insufficient mobile computing power [1]. Thus, augmented reality remains a topic of continuing research. However, for many applications, such as cultural heritage [2], archaeology [3], military services, architecture and entertainment [4], it is possible to achieve good results with cheap AR technology, as demonstrated by the authors' system presented below.

The main objective of augmented reality technology is to superimpose computer-generated information directly onto a user's sensory perception [5], rather than replacing it with a completely synthetic environment. Augmented reality systems can improve performance times when training is conducted on real objects [6]. Users of augmented reality systems must be able to interact with the 3D information in a natural way, as they do in the real environment. Much research is taking place on augmented reality interfaces in order to achieve this goal [7].

The findings presented in this paper are the outcome of an ongoing research project called the Virtual Interactive Teaching Environment [8]. This system was designed to enable the educator to use more sophisticated techniques and, as a result, to help students learn more effectively through the use of Virtual Multimedia Content (VMC). The contribution of the current interactive system takes into consideration both technological and educational issues. Consequently, a multimedia augmented reality interface was designed and implemented in an effective and efficient manner for both teachers and students.

RELATED WORK

The work presented in this article relates to, and incorporates elements from, different research areas: augmented reality user interfaces and virtual training. The authors' notion of the interactive system dubbed Multimedia Augmented Reality Interface for E-Learning (MARIE) is closely related to MagicBook, a powerful augmented reality interface [9]. This system uses a real book to transfer users from reality to virtuality. Virtual objects are superimposed on the pages of the book and users can interact with the augmented scene. Another similar generic augmented reality platform supports direct interaction with virtual objects using a pen and pad interface [10].

From an educational perspective, a summary of the potential benefits of using existing virtual environments in education and training is well defined elsewhere [11]. In addition, Computer-Aided Learning (CAL) has played an important role in engineering education and training [12]. A good educational augmented reality learning system for mathematics and geometry education is based on the Construct3D tool, where teacher and students can interact through various interactive scenarios [13]. The related MARE system mixes virtual reality techniques with human-computer interaction techniques [14]. Finally, the Classroom of the Future is an interesting augmented reality approach to improving education [15].
OVERALL ARCHITECTURE

The architecture of the MARIE system prototype is presented in Figure 2. It consists of a lightweight Head Mounted Display (HMD), a small camera and a computer. The system uses some of the functionality of the ARToolKit, a freely distributed C software library designed to support the easy development of augmented reality applications [16]. This toolkit makes use of computer vision techniques to compute the real camera position and orientation relative to predefined marked cards.

Figure 1: Examples of trained markers.

The general pipeline of the video see-through system can be explained through the following scenario. The user is equipped with a custom-built see-through HMD, which is a CyVisor with a bullet camera mounted on it. The user observes a tabletop indoor environment (ie a typical student desk workspace). The user places a set of predefined markers on the table according to some predetermined learning context and, as the user looks at the markers through the HMD, multimedia information is mixed with the real environment in real time.

DESIGN OF THE SYSTEM

One of the main goals of this system was to design a lightweight system that can easily be used by inexperienced students. Hence, large and cumbersome HMDs are not used; instead, this project implemented the HMD from existing technology, such as the CyVisor and simple bullet cameras. Other off-the-shelf hardware and software components were also utilised in order to keep the total budget low and to create an ergonomic architecture that is easily reusable.

The software adopted for three-dimensional modelling is 3ds max (version 4.2). Three-dimensional sound is created using a software interface to audio hardware, the OpenAL API [17]. A Pentium III processor at 700 MHz was used, together with a GeForce graphics card. When the system runs at maximum resolution (768×576 pixels), the frame rate is reduced. Therefore, a lower resolution (384×288 pixels) was used in order to achieve real-time performance. Thus, the CyVisor, with its 800×600 pixel resolution, is more than adequate. It should also be noted that the visualisation can also be viewed on a standard monitor.

The teaching material is decomposed into appropriate units (future work will see these units specified in XML) and a marker was created for each one. Therefore, the users have a selection of markers associated with the teaching material (see Figure 1).
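To make this marker-driven pipeline concrete, the sketch below outlines how such a video see-through loop can be assembled from ARToolKit calls. It is illustrative only and not the authors' implementation: the camera parameter file, the marker pattern file and the render_vmc() routine are hypothetical placeholders, and error handling has been reduced for brevity.

    /* Illustrative sketch of an ARToolKit-style video see-through loop (not the
       authors' code). File names and render_vmc() are hypothetical placeholders. */
    #include <stdlib.h>
    #include <GL/glut.h>
    #include <AR/ar.h>
    #include <AR/param.h>
    #include <AR/video.h>
    #include <AR/gsub.h>

    static int    patt_id;                       /* trained marker pattern id   */
    static double patt_width     = 80.0;         /* marker width in millimetres */
    static double patt_centre[2] = {0.0, 0.0};
    static double patt_trans[3][4];              /* marker-to-camera transform  */

    /* Superimpose the VMC associated with the marker using its pose. */
    static void render_vmc(double trans[3][4])
    {
        double gl_para[16];

        argDrawMode3D();
        argDraw3dCamera(0, 0);
        argConvGlpara(trans, gl_para);           /* 3x4 pose -> OpenGL matrix   */
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixd(gl_para);
        /* ... draw the three-dimensional teaching object here ... */
    }

    static void main_loop(void)
    {
        ARUint8      *frame;
        ARMarkerInfo *markers;
        int           marker_num, i, best = -1;

        if ((frame = (ARUint8 *)arVideoGetImage()) == NULL) return;

        argDrawMode2D();
        argDispImage(frame, 0, 0);               /* live video as background   */

        /* Detect all markers in the frame (100 is a typical threshold). */
        if (arDetectMarker(frame, 100, &markers, &marker_num) < 0) exit(1);
        arVideoCapNext();

        for (i = 0; i < marker_num; i++)         /* keep the most confident hit */
            if (markers[i].id == patt_id &&
                (best < 0 || markers[i].cf > markers[best].cf)) best = i;

        if (best >= 0) {
            /* Camera position and orientation relative to the detected marker. */
            arGetTransMat(&markers[best], patt_centre, patt_width, patt_trans);
            render_vmc(patt_trans);
        }
        argSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        ARParam wparam, cparam;
        int     xsize, ysize;

        glutInit(&argc, argv);
        arVideoOpen("");                                  /* default capture device  */
        arVideoInqSize(&xsize, &ysize);
        arParamLoad("Data/camera_para.dat", 1, &wparam);  /* hypothetical file name  */
        arParamChangeSize(&wparam, xsize, ysize, &cparam);
        arInitCparam(&cparam);
        argInit(&cparam, 1.0, 0, 0, 0, 0);                /* open the display window */
        patt_id = arLoadPatt("Data/patt.moore");          /* hypothetical pattern    */

        arVideoCapStart();
        argMainLoop(NULL, NULL, main_loop);               /* callbacks omitted       */
        return 0;
    }

In MARIE, the equivalent of render_vmc() would select and display the appropriate VMC (three-dimensional object, animation, images, text or sound) for the detected marker.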
The teacher now has only to devise the learning strategy; that is, the sequence in which they instruct the students to observe the markers and hence display the learning material in an AR environment (similar to the MagicBook concept).

Figure 2: Architecture of the system. The video camera mounted on the HMD feeds the PC, which decomposes the video into frames, detects and calculates marker position and orientation, and registers and renders the 3D objects, linking the real world to the augmented world.

With teacher guidance, the student thus selects the specific marker needed for the required visualisation. As an example, a 3D representation of the Moore machine flow chart has been generated by analysing the teaching material (lecture notes). To derive the animation, the scene is rendered and exported to a predefined file format (ie moore.avi). The codec used to compress the file was DivX™, a digital video compression technology based on the ISO MPEG-4 standard, which delivers quality at broadband bit rates significantly superior to competing technologies.

IMPLEMENTATION RESULTS

When a marker comes into the user's view, the camera's output provides input to the video capture subsystem (a WinTV card), where the VMC is overlaid onto the video in real time (see Figure 3). The resulting augmented visualisation is output to the HMD and the monitor, as shown in Figures 4 and 5.

Figure 3: MARIE's operation.

The main window represents the augmented world, where the three-dimensional objects are superimposed on the tabletop environment in real time. Three-dimensional text is also overlaid on the window, acting as a help option for the user. The bottom-left window plays an animation describing the object and the theory. The bottom-right window displays images representing two-dimensional diagrams. In essence, MARIE presents all the required multimedia information to the user in augmented form. These windows can be swapped around and can be displayed as separate windows if required.

USER-TEACHER INTERACTION

Users and teachers can interact with the augmented reality interface in two different ways. The users can freely manipulate the marker in three-dimensional space to obtain a different view. They can also use the mouse to zoom, pan and rotate the rendered images and animations, and can choose to play the .wav file associated with the marker using a keyboard button. On the other hand, the teacher (who does not wear an HMD) can interact with the student using the mouse or the keyboard in the same way. One clear advantage of MARIE is that the teacher can speak to the student at the same time as the user listens to the .wav file, and the system mixes the two sound sources together.

Figure 4: Monitor augmented reality view.

Figure 5: Interaction with the system.

Finally, non-immersed users (those who do not wear HMDs) can collaborate with the immersed user through the monitor and voice (see Figure 5). Imagine the scenario where the teacher advises the immersed user in the same augmented workspace.
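As an illustration of how the .wav playback described above could be driven through OpenAL, the following sketch positions a sound source at the marker location and starts playback when the associated keyboard button is pressed. It is only an outline under stated assumptions: load_wav() is a hypothetical helper for reading the PCM data, and the marker position passed to play_marker_sound() would come from the tracking transform.

    /* Illustrative OpenAL sketch (not the authors' code): play the .wav file
       associated with a marker as a sound source positioned at the marker. */
    #include <AL/al.h>
    #include <AL/alc.h>

    static ALuint source, buffer;

    /* Hypothetical helper: reads a .wav file and returns its PCM data. */
    extern ALboolean load_wav(const char *path, ALenum *format,
                              ALvoid **data, ALsizei *size, ALsizei *freq);

    void init_marker_sound(const char *wav_path)
    {
        ALCdevice  *device  = alcOpenDevice(NULL);        /* default device   */
        ALCcontext *context = alcCreateContext(device, NULL);
        ALenum format; ALvoid *data; ALsizei size, freq;

        alcMakeContextCurrent(context);

        alGenBuffers(1, &buffer);
        alGenSources(1, &source);

        load_wav(wav_path, &format, &data, &size, &freq); /* eg "moore.wav"   */
        alBufferData(buffer, format, data, size, freq);
        alSourcei(source, AL_BUFFER, buffer);

        alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);      /* listener at HMD  */
    }

    /* Called when the user presses the playback key for the current marker. */
    void play_marker_sound(float x, float y, float z)     /* marker position  */
    {
        alSource3f(source, AL_POSITION, x, y, z);
        alSourcePlay(source);
    }

Only playback of the marker's .wav file is sketched here; the mixing with the teacher's voice is as described in the text.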
CONCLUSION AND FUTURE WORK

MARIE is focused on enhancing the teaching and learning process in an augmented reality e-learning environment. The main advantages of the presented augmented reality system are its low cost and real-time augmented presentation. While the system presented here is applied to engineering education, it is a generic platform applicable to a variety of other teaching and learning augmented reality environments, such as cultural heritage, archaeology, surgical operations, military services, architecture and others.

Future work will involve the improvement of human-computer interaction techniques and the addition of other useful haptic devices to track objects in the AR scene, allow navigation and so on. The implications of such an augmented reality system could include other forms of augmentation, like touch and smell. Indeed, all of the senses may be mixed with the real environment through state-of-the-art hardware equipment.

ACKNOWLEDGEMENTS

Part of this research was funded by the EU IST Framework V programme, Key Action III - Multimedia Content and Tools, Augmented Representation of Cultural Objects (ARCO) project IST-2000-28366.

REFERENCES

1. Azuma, R., Baillot, Y., Behringer, R., Feiner, S. and MacIntyre, B., Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21, 6, 34-47 (2001).
2. ARCO Consortium, Augmented Representation of Cultural Objects (2001), http://www.arco-web.org
3. ARCHEOGUIDE, Augmented Reality-based Cultural Heritage On-Site Guide (2002), http://archeoguide.intranet.gr/project.htm
4. Azuma, R., A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6, 4, 355-385 (1997).
5. Feiner, S.K., Augmented reality: a new way of seeing. Scientific American, 286, 4, 48-55 (2002).
6. Boud, A.C., Haniff, D.J., Baber, C. and Steiner, S.J., Virtual reality and augmented reality as a training tool for assembly tasks. Proc. Inter. Conf. on Information Visualization (1999).
7. Poupyrev, I., Tan, D.S., Billinghurst, M., Kato, H., Regenbrecht, H. and Tetsutani, N., Developing a generic augmented-reality interface. IEEE Computer, 35, 3, 44-50 (2002).
8. White, M., Jay, E., Liarokapis, F., Kostakis, C. and Lister, P.F., A Virtual Interactive Teaching Environment (VITE) using XML and augmented reality. The Interactive J. of Electrical Engng. Educ. (2001).
9. Billinghurst, M., Kato, H. and Poupyrev, I., The MagicBook: a transitional AR interface. Computers & Graphics, 25, 745-753 (2001).
10. Reitmayr, G. and Schmalstieg, D., A wearable 3D augmented reality workspace. Proc. 5th Inter. Symp. on Wearable Computers (ISWC'01) (2001).
11. Mantovani, F., VR learning: potential and challenges for the use of 3D environments in education and training. In: Riva, G. and Galimberti, C. (Eds), Towards CyberPsychology: Mind, Cognitions and Society in the Internet Age. Amsterdam: IOS Press (2001).
12. Kluj, S., The potential of Computer Aided Learning and its impact on marine engineering education and training. Proc. 3rd Global Congress on Engng. Educ., 342-344 (2002).
13. Kaufmann, H. and Schmalstieg, D., Mathematics and geometry education with collaborative augmented reality. Proc. SIGGRAPH 2002 Conf. Abstracts and Applications (to be published).
14. Grasset, R. and Gascuel, J-D., MARE: Multiuser Augmented Reality Environment on table setup. Proc. ACM SIGGRAPH Conf. Abstracts and Applications (2002).
15. Cooperstock, J.R., The classroom of the future: enhancing education through augmented reality. Proc. HCI Inter. 2001 Conf. on Human-Computer Interaction, New Orleans, USA, 688-692 (2001).
16. HIT Lab, Augmented and Mixed Reality, ARToolKit. Seattle: University of Washington (2002), http://www.hitl.washington.edu/people/poup/research/ar.htm#artoolkit
17. Loki Entertainment Software, OpenAL Specification and Reference. Tustin: Loki Entertainment Software (2000), http://www.openal.org/home/