PA198 Augmented Reality Interfaces
Lecture 10: Augmented Reality Applications
Fotis Liarokapis
28th November 2016

Archaeology

An Augmented Reality Interface for Visualizing and Interacting with Virtual Content
Liarokapis, F. An Augmented Reality Interface for Visualizing and Interacting with Virtual Content, Virtual Reality, Springer, 11(1): 23-43, 2007.

Augmented Reality Interface Toolkit
[Architecture diagram: AR applications sit on top of an interface framework with interaction algorithms, audio algorithms based on the OpenAL API and visual algorithms based on the OpenGL API; underneath are the Vision SDK video libraries, the ARToolKit tracking libraries and the Windows OS libraries and drivers (graphics and sound card, video camera, etc.); input hardware devices include a computer, HMD display, plasma screen display, video splitter, web camera, marker cards and flat screen display.]
Liarokapis, F., White, M., Lister, P.F. Augmented Reality Interface Toolkit, Proc. of the International Symposium on Augmented and Virtual Reality, IV04-AVR, IEEE Computer Society, London, 761-767, 2004. (ISBN: 0-7695-2177-0)

Layers Interactions
[Diagram: input hardware devices (laptop, HMD display, plasma screen, video splitter, web camera, marker cards, flat screens) feeding the augmented reality environment and a touch screen interface.]

Visualisation Capabilities
• Visualisation
  – 3D models
  – 3D text
  – 3D sound
  – Videos
  – Pictures
• The GUI consists of:
  – Menu
  – Toolbar
  – Status bar
  – Dialog boxes
Liarokapis, F. Augmented Reality Interfaces - Architectures for Visualising and Interacting with Virtual Information, Sussex theses S 5931, Department of Informatics, School of Science and Technology, University of Sussex, Falmer, UK, 2005.

Broken Artefact Augmentation

ARCO - An Architecture for Digitization, Management and Presentation of Virtual Exhibitions
White, M., Mourkoussis, N., et al. ARCO - An Architecture for Digitization, Management and Presentation of Virtual Exhibitions, Proc. of the 22nd International Conference on Computer Graphics (CGI'2004), IEEE Computer Society, Hersonissos, Crete, June 16-19, 622-625, 2004.

The ARCO Project (EU)

ARCO System Architecture
[Diagram: content production (acquisition, Object Modeller, Object Refiner) feeds a database managed through ACMA; content management supports designing virtual exhibitions; content visualization delivers Web + VR and Web + AR presentations; XDE handles data exchange.]
White, M., Mourkoussis, N., et al. ARCO - An Architecture for Digitization, Management and Presentation of Virtual Exhibitions, Proc. of the 22nd International Conference on Computer Graphics (CGI'2004), IEEE Computer Society, Hersonissos, Crete, June 16-19, 622-625, 2004. (ISBN: 0-7695-2171-1)

Interactive Visualisation Interface
[Component diagram: a web browser executable uses a communication interface library with MSXML and WinInet wrappers; the interactive visualisation table-top (AR) executable uses the ARToolKit, OpenGL and SpaceMouse libraries; the file system holds cache and AR data documents; XML configuration is handled through an MSXML wrapper and an XMLConfig wrapper reading the xml_config and runtime_config documents.]
Liarokapis, F., Sylaiou, S., et al. An Interactive Visualisation Interface for Virtual Museums, Proc. of the 5th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, Eurographics Association, Brussels, Belgium, 6-10 Dec, 47-56, 2004. (ISBN: 3-905673-18-5)
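A minimal sketch of how the layers named in the architectures above (ARToolKit tracking, OpenGL visuals, OpenAL audio) typically cooperate in a single frame. It follows the structure of the standard ARToolKit sample programs rather than the toolkit's actual source; the pattern id, pattern width and sound-source handle are assumed parameters.

```cpp
#include <AR/ar.h>
#include <AR/gsub.h>
#include <AR/video.h>
#include <GL/gl.h>
#include <AL/al.h>

// One frame of the interface toolkit's main loop: grab the video frame, detect
// markers, compute the camera-to-marker pose, draw the virtual content with
// OpenGL and keep a spatialised OpenAL source attached to the same marker.
void renderFrame(int threshold, int kPattId, double kPattWidth, ALuint gSoundSource) {
    ARUint8 *frame = (ARUint8 *)arVideoGetImage();      // grab the current video frame
    if (frame == NULL) return;

    argDrawMode2D();
    argDispImage(frame, 0, 0);                          // draw the video background

    ARMarkerInfo *markers;
    int markerNum;
    if (arDetectMarker(frame, threshold, &markers, &markerNum) < 0) return;

    // Pick the detection of our pattern with the highest confidence.
    int best = -1;
    for (int i = 0; i < markerNum; i++)
        if (markers[i].id == kPattId && (best < 0 || markers[i].cf > markers[best].cf))
            best = i;

    if (best >= 0) {
        double centre[2] = {0.0, 0.0}, trans[3][4], glMat[16];
        arGetTransMat(&markers[best], centre, kPattWidth, trans);  // camera-to-marker pose
        argConvGlpara(trans, glMat);                               // convert to an OpenGL matrix

        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixd(glMat);
        // ... draw the 3D model, 3D text, video or picture attached to this marker ...

        // Keep the 3D sound source co-located with the augmentation.
        alSource3f(gSoundSource, AL_POSITION,
                   (float)trans[0][3], (float)trans[1][3], (float)trans[2][3]);
    }

    arVideoCapNext();                                   // start capturing the next frame
    argSwapBuffers();
}
```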
Interactive Visualisation AR
[Flowchart: if camera detection and initialisation is successful, the AR resources are downloaded from the server, otherwise the AR table-top is disabled; on a click, used markers are freed and AR objects are associated with the available markers; if marker detection is successful, the AR objects are drawn on the visible markers.]
Liarokapis, F., Sylaiou, S., et al. An Interactive Visualisation Interface for Virtual Museums, Proc. of the 5th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, Eurographics Association, Brussels, Belgium, 6-10 Dec, 47-56, 2004. (ISBN: 3-905673-18-5)
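A small illustrative sketch of the marker-association step in the flowchart above: free the used markers, then assign the downloaded AR objects to whichever marker cards are available. The Marker and MuseumObject types are hypothetical and are not taken from the ARCO code.

```cpp
#include <vector>

struct Marker { int id; bool visible; int objectIndex; };   // objectIndex < 0 means the marker is free
struct MuseumObject { /* geometry, textures, metadata downloaded from the server */ };

// Free every marker and then associate each downloaded AR object with the next
// available marker card; objects without a free marker simply stay unassigned.
void associateObjects(std::vector<Marker> &markers, const std::vector<MuseumObject> &objects) {
    for (Marker &m : markers) m.objectIndex = -1;            // free used markers
    std::size_t next = 0;
    for (std::size_t i = 0; i < objects.size() && next < markers.size(); ++i)
        markers[next++].objectIndex = static_cast<int>(i);   // assign object i to a marker
}

// At render time only visible markers with an assigned object are drawn.
bool shouldDraw(const Marker &m) { return m.visible && m.objectIndex >= 0; }
```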
Visualisation Techniques
Liarokapis, F., Sylaiou, S., et al. An Interactive Visualisation Interface for Virtual Museums, Proc. of the 5th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, Eurographics Association, Brussels, Belgium, 6-10 Dec, 47-56, 2004. (ISBN: 3-905673-18-5)

Augmented Reality Exhibitions
• The AR application integrates two components:
  – Web browser
  – AR browser

AR Artefact Visualisation, Victoria & Albert Museum
Liarokapis, F., Sylaiou, S., et al. An Interactive Visualisation Interface for Virtual Museums, Proc. of the 5th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, Eurographics Association, Brussels, Belgium, 6-10 Dec, 47-56, 2004. (ISBN: 3-905673-18-5)

Fishbourne Roman Palace Video
Magic Book
ARCO Video Navigation

Evaluation of a Mobile MR Geovisualisation Interface
Liarokapis, F. Evaluation of a Mobile MR Geovisualisation Interface, Proc. of the 29th Annual Conference of the European Association for Computer Graphics (EUROGRAPHICS 2008), Eurographics, Crete, Greece, 14-18 April, 231-234, 2008.

Introduction
• Need for advanced geographical visualisation systems
  – The majority of systems use only 2D images/maps to generate user views
  – Geographers are slowly moving from 2D to 3D
• Existence of advanced visualisation techniques
  – More user-friendly interfaces
  – Different technologies ranging from VR and AR to MR
  – Digital mapping offers the ability to use 3D, including latitude, longitude and elevation
• Limited user studies

Challenges
• Cognitive challenges
  – An order of magnitude greater than the problem solved by contemporary systems on the market (e.g. Google Earth, Virtual Earth 3D, satnav)
• Some of the constraints include:
  – Size/power/weight limits
  – Diversity of landmarks required
  – Problem of orientation
  – Wide range of spatial ability across users

Important Issues
• Determine important issues for geovisualisation
  – Extensions to navigation
• Explore the following:
  – Appropriate visualisation domain
  – Types of geographical information
  – Best medium of interaction

System Architecture

Video

Expert User Evaluation
• Six expert users
  – Backgrounds in geography, geovisualisation, mixed reality, information retrieval, human-computer interaction and psychology
  – Average session time was 30 minutes
  – Aged between 30 and 50 years old
• Tested:
  – General issues about pedestrian navigation
  – Testing of four hypotheses for virtual navigation

Issues about Pedestrian Navigation
• Gather user requirements regarding some of the issues related to urban navigation, grouped as:
  – User, environmental and external
• User-related requirements
  – Speed (smooth or slow speed is preferred), FOV, eye-level view (not the bird's-eye perspective), position of the user and navigator, orientation (relates to the FOV)
• Environmental requirements
  – Road signs (when navigating unfamiliar environments), mode of travel and the style of navigation, landmarks in combination with visual cues
• External requirements
  – Paper maps (2D maps, e.g. A-Z maps) are used for urban navigation, but reading paper maps is difficult, whereas digital maps save time

Hypotheses Testing
a. Textured vs. non-textured 3D map
b. Correct vs. wrong vs. landmark textures
c. High-resolution vs. low-resolution textures
d. High-detail vs. medium-detail vs. low-detail

Textured vs. non-textured 3D map
• Use of textures is an appropriate way of visualising virtual environments
  – Using textures is more appreciated than using no textures at all
  – Participants did not like the 3D map without textures because the textured model was more realistic
  – When navigating the non-textured model they would not have known where they were had they not seen the textured model earlier

Correct vs. wrong vs. landmark textures
• Participants felt there was a difference between these models but could not work out what the difference was
  – They did not recognise when the textures were wrong
  – The result of this task shows that a mixture of realistic objects (textured) and abstract objects (non-textured) does not make the model pleasant to use, and that the use of landmarks aids navigation
  – All the participants agreed that landmarks are very important to virtual navigation

High resolution vs. low resolution textures
• Participants did not feel any difference between the high- and low-resolution models
  – Speed is important and, if the level of speed is constant, high resolution could be chosen as the better option
  – It would be better if high resolution could be used in a 3D environment, but it is not necessary
High-detail vs. medium-detail vs. low detail
• Although they preferred the fully detailed model, participants generally did not like having too much detail
  – They preferred a less detailed but more realistic representation of the environment
  – The level of detail was sufficient and the virtual representation gave them a good idea of the real environment
  – The use of textual annotations indicating building names was effective
  – It was also pointed out that the extra street detail would be useful if used to distinguish one location from another

End-User Evaluation
• Determine user needs during geovisualisation and spatial exploration
• 30 users
  – 15 male and 15 female
  – Aged 18 to 43
• Users were assessed on different issues:
  – Satisfaction, interaction and comparison of navigation aids
• Two questionnaires
  – Open-ended questions
  – Scaling and multiple-choice questions

Satisfaction
• VR (M = 4, SD = 0.94686, SE = 0.17287)
• AR (M = 2.8, SD = 0.8469, SE = 0.15462)
• MR (M = 2.8, SD = 1.18613, SE = 0.21656)

Comparison of Navigation Aids
• 3D map (M = 4.2, SD = 0.8469, SE = 0.15462)
• Text (M = 3.96667, SD = 1.18855, SE = 0.217)
• 2D map (M = 3.76667, SD = 1.19434, SE = 0.21805)
• 3D sound (M = 3.5, SD = 1.3834, SE = 0.25257)

Interaction
• Marker cards (M = 4.1, SD = 0.99481, SE = 0.18163)
• Keyboard (M = 3.8, SD = 0.88668, SE = 0.16189)
• Menu interface (M = 3.16667, SD = 1.17688, SE = 0.21487)

Follow-up Prototype
• Based on a PDA
• Different tracking
  – GPS & compass
• Continuous development & evaluation

A Mobile Framework for Tourist Guides
Liarokapis, F., Mountain, D. A Mobile Framework for Tourist Guides, Workshop on Virtual Museums, 8th International Symposium on Virtual Reality, Archaeology and Cultural Heritage (VAST '07), Eurographics, Brighton, UK, 26-30 November, 2007.

Introduction
• Heritage visitors often use electronic equipment
  – Sound players, special headsets or personal computers equipped with multimedia applications
• Open-air heritage sites are expected to see tremendous growth in the coming years
  – Mainly because of the advances in ubiquitous computing
• Mobile ubiquitous computing has contributed to the development of an emerging class of services
  – Known as LBS
• LBS provide information in relation to the spatial location of the receiver

Mobile Computing
• Mobile devices present opportunities and constraints
  – When extending a map-based interface into a digital environment
• The major constraint is space
  – Displays are typically smaller than 6x10 cm
• Need to reduce the volume of information
  – Still further than in the case of the physical sign
• An opportunity is the ability to filter information according to personal preferences
  – To display only the information that the user is interested in

Location Aware Services
• Display just the necessary information
  – e.g. about how industrial uses of the land have affected the area through history
• Highly personalised maps
  – Presenting one type of information at the expense of everything else
• Location-aware interfaces

Mobile Framework
• The client-server mobile framework uses the capabilities of the Camineo Guide
  – Specifically designed to provide digital guides for tourism
  – A user-friendly traveling companion which reveals the richness of an area of interest
  – Portable and small; it is the same size as a Pocket PC or last-generation mobile phone

Development of LOcation Context Tools for UMTS Mobile Information Services (LOCUS)
• LOCUS allows users to switch rendering modes:
  – The traditional digital map guide
  – A virtual reality guide
  – An augmented reality guide
• This allows tourists to select the most appropriate presentation type according to their needs
• Provides both position and orientation real-time tracking on mobile devices
  – Operating anywhere and anytime
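A hypothetical sketch (none of these class or function names come from LOCUS) of how the three presentation modes listed above can be organised behind a common interface, so that the guide can switch between the map, VR and AR views at runtime while feeding them the same spatial data.

```cpp
#include <memory>

struct Pose { double lat, lon, heading; };   // the user's tracked position and orientation

// Each guide renders the same spatially referenced content in its own way.
class GuideView {
public:
    virtual ~GuideView() = default;
    virtual void render(const Pose &user) = 0;
};

class MapGuide : public GuideView { public: void render(const Pose &) override { /* 2D map */ } };
class VRGuide  : public GuideView { public: void render(const Pose &) override { /* 3D scene */ } };
class ARGuide  : public GuideView { public: void render(const Pose &) override { /* camera overlay */ } };

enum class Mode { Map, VR, AR };

// The tourist picks whichever presentation type suits the situation.
std::unique_ptr<GuideView> makeGuide(Mode m) {
    switch (m) {
        case Mode::Map: return std::make_unique<MapGuide>();
        case Mode::VR:  return std::make_unique<VRGuide>();
        default:        return std::make_unique<ARGuide>();
    }
}
```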
Mobile Heritage System
• The main objective is to provide advanced LBS to users delivered through a web-browser interface
  – i.e. Pocket Internet Explorer
• In terms of functionality, it offers:
  – Position and orientation information
  – ‘Mobile search’ options based on a geo-referenced spatial database
  – Routing tools developed to provide advanced navigational assistance to mobile users
• Based upon the experience of previous users
  – It can suggest different routes depending on when the journey is to be taken

Variation in Temporal Proximity
• Calculate speed of movement for the edges of a transportation network
  – Speeds reflect the spatial behaviour of individuals
[Maps: 6am - least constrained; 9am - more constrained; 12pm - most constrained]

Multimodal Mobile Guide Visualisation
• Diverse candidate interfaces exist for the presentation of spatially referenced information on mobile devices
• Opportunity of presenting spatially referenced information relative to the device user’s current location
• Three alternative interfaces have been developed for presenting information in situ, including:
  – 2D map interface
  – Virtual reality map interface
  – Augmented reality map interface

Spatial Localisation
• To provide effective location-based services, continuous tracking of pose is required
  – Position (x, y and z)
  – Orientation (yaw, pitch and roll)
• Two different approaches were tested
  – A sensor solution
  – A computer vision approach

Computer Vision Solution
• Based on the calculation of features belonging to the real environment
  – Road signs
  – Parts of buildings
• However
  – Very limited range of operation
  – Not a robust solution for unknown environments

Sensor Solution
• Sensor-based solutions are more robust:
  – They can work anywhere and at any time
  – They provide sufficient accuracy for real-time navigation and wayfinding applications
• The sensor-based interface is designed to support mobile heritage guides through:
  – Self-localisation
  – Orientation
  – POI identification

Position Tracking
• In the initial prototype, Bluetooth GPS receivers based on the SiRFstar III chipset were used due to:
  – Small size
  – Long battery life
  – Massive correlation power
• In the final prototype, PDAs with embedded assisted GPS were used
  – e.g. HP 6915 PDA, Mio A701
[Image: Holux GPSlim 236 vs. Pharos GPS receivers]

Orientation Tracking
• Integration of a digital compass is not easy
  – The hardware is not packaged for consumer use
• Software integration needs to be implemented for the specific device
• The digital compass hardware was connected to a cordless Bluetooth serial adaptor
• A tool was written to read the data strings from the selected COM port
[Image: Honeywell HMR3300 digital compass]
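The two tracking slides above read NMEA sentences from the Bluetooth GPS receiver and heading strings from the digital compass over a serial (COM) port. The sketch below shows the kind of parsing such a reader tool needs; the NMEA $GPGGA layout is standard, while the compass output is assumed here to be a comma-separated heading,pitch,roll ASCII string and should be checked against the HMR3300 documentation.

```cpp
#include <sstream>
#include <string>
#include <vector>
#include <cmath>

// Split a comma-separated sentence into fields.
static std::vector<std::string> split(const std::string &line) {
    std::vector<std::string> out;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ',')) out.push_back(field);
    return out;
}

// Convert the NMEA ddmm.mmmm / dddmm.mmmm representation to decimal degrees.
static double nmeaToDegrees(const std::string &value, const std::string &hemisphere) {
    double raw = std::stod(value);
    double degrees = std::floor(raw / 100.0);
    double minutes = raw - degrees * 100.0;
    double result = degrees + minutes / 60.0;
    return (hemisphere == "S" || hemisphere == "W") ? -result : result;
}

struct Fix { double lat, lon; bool valid; };
struct Heading { double yaw, pitch, roll; };

// Parse a $GPGGA sentence, e.g. "$GPGGA,123519,4807.038,N,01131.000,E,1,08,...".
Fix parseGGA(const std::string &sentence) {
    std::vector<std::string> f = split(sentence);
    if (f.size() < 7 || f[0] != "$GPGGA" || f[6] == "0") return {0.0, 0.0, false};
    return {nmeaToDegrees(f[2], f[3]), nmeaToDegrees(f[4], f[5]), true};
}

// Parse an assumed "heading,pitch,roll" ASCII string from the digital compass.
Heading parseCompass(const std::string &line) {
    std::vector<std::string> f = split(line);
    Heading h{0.0, 0.0, 0.0};
    if (f.size() >= 3) { h.yaw = std::stod(f[0]); h.pitch = std::stod(f[1]); h.roll = std::stod(f[2]); }
    return h;
}
```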
2D Map Interface
• In the pre-digital age, signage often showed maps with ‘you are here’ arrows indicating the location of the sign
  – The location of specific features of interest was often shown on the map
• Sign designers would often have to decide upon the most pertinent information to display
  – At the expense of features considered less important
• These maps displayed a wide range of information, from basic utilities to cultural information
  – e.g. the location of the most popular site attractions

Digital Map Mobile Interface
• The digital map interface includes four different representations of maps at different resolutions
  – These can be accessed from the ‘-’ and ‘+’ buttons of the interface
• A coloured circle represents the position of the visitor and the arrow the orientation
• Visitors can interact with the map by selecting one of the eight arrow buttons using the stylus of the device

Virtual Reality Map Interface
• An alternative to the map interface on the mobile device is to use a mobile VR interface
  – A realistic 3D representation of the map interface
• Both maps and virtual scenes are abstractions of reality
• Whilst maps adopt an allocentric (bird’s-eye) perspective on a 2D abstraction, VR adopts an egocentric perspective on a 3D abstraction
• This perspective aims to mirror a visitor’s perspective of a cultural heritage site

Virtual Reality Mobile Interface
• This virtual space can present further relevant information to visitors
  – Labels (such as building names) may be attached to objects in the scene
  – Directions may be presented for a walking tour of a site
  – Anchor points may allow users to click through to reveal further background information about a feature

Augmented Reality Map Interface
• AR aims to combine real and virtual information within a single view
• In the past we developed a prototype for the UMPC that allows for the combination of four different types of information
  – 3D maps, 2D maps, text and spatial sound

Augmented Reality Mobile Interface
• However, for PDA-based tourist guide applications textual information usually suffices
  – Orthogonal projection
  – Perspective projection

Swiss Park Guide
• A location-aware mobile guide was implemented in the Swiss National Park
  – Switzerland’s only national park and a site of great natural and cultural importance
• Visitors are attracted by:
  – The dramatic, mountainous scenery
  – The rare flora and fauna
  – The history of human influence on the park
• The map was chosen as the primary interface for the system
  – All information stored in the information database was spatially referenced

Personalisation
• To allow visitors to generate personalised maps, information was structured according to:
  – Activities visitors participated in (e.g. hiking)
  – Frequently asked questions about facilities and access (service/tips)
  – Visitors’ main interests (fauna and flora, habitats)
  – Other themes, based upon current exhibitions at the park
• Visitors could generate a bespoke map based upon their personal interests and current location, e.g.:
  – The locations of the sites of lime kilns in the park
  – The extent of the range of red deer
  – Places where the flower edelweiss is likely to be found

Personalisation Example
[Map: distribution of red deer (darker brown = more likely to find red deer) relative to the user’s current location (the red cross)]
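The personalisation described above filters the spatially referenced information database by theme and by distance from the visitor's current position. Below is a minimal sketch of that kind of query, with an assumed PointOfInterest record and a "search around me" radius; it is illustrative only, not the guide's actual implementation.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Hypothetical data model: every item in the information database is
// spatially referenced and tagged with a theme (e.g. "fauna", "lime kilns").
struct PointOfInterest { std::string name, theme; double lat, lon; };

// Great-circle distance in metres (haversine formula).
double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
    const double kEarthRadius = 6371000.0;
    const double kRad = 3.14159265358979323846 / 180.0;
    double dLat = (lat2 - lat1) * kRad, dLon = (lon2 - lon1) * kRad;
    double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
               std::cos(lat1 * kRad) * std::cos(lat2 * kRad) *
               std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * kEarthRadius * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));
}

// "Search around me": keep only items of the chosen theme within a radius of
// the visitor's current position, so the bespoke map shows just that layer.
std::vector<PointOfInterest> personalisedLayer(const std::vector<PointOfInterest> &all,
                                               const std::string &theme,
                                               double userLat, double userLon,
                                               double radiusMetres) {
    std::vector<PointOfInterest> result;
    for (const PointOfInterest &p : all)
        if (p.theme == theme &&
            distanceMetres(userLat, userLon, p.lat, p.lon) <= radiusMetres)
            result.push_back(p);
    return result;
}
```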
Swiss Park Evaluation Results
• An evaluation study was conducted with 87 participants
  – Overall, the user reaction to the guide was very positive
  – Considering the quality of the information presented by the device, 3/4 rated this as either ‘very good’ or ‘good’
  – Some 2/3 considered the ease of information provision to be ‘very good’ or ‘good’
• The approach of personalising the information presented to the user by filtering it according to the user’s position appears to have been a valued strategy:
  – Over 40% of participants considered the filtered information retrieved to be ‘very relevant’

Swiss Park Evaluation Results (continued)
• The ‘search around me’ filter provided the most relevant information, with 2/3 of the respondents saying that it provided ‘extremely relevant’ results
  – Over 90% claimed that the results were either ‘extremely relevant’ or ‘relevant’
• The ‘search ahead of me’ filter also performed well (89%)
  – Claiming that results were either ‘extremely relevant’ or ‘relevant’, with half claiming that the results were ‘extremely relevant’
• Presenting information over a map appeared to meet the needs of visitors:
  – 85% of participants found the maps showing information retrieved following personalised searches to be ‘beneficial’ or ‘very beneficial’

City University Guide
• Another customisation that was performed is the City University guide
• A multimodal mobile guide was designed
  – Map, VR and AR domains
• The GPS is embedded in the mobile device, while the digital compass is a separate component hidden inside a blue cylinder

Urban Mobile Guide
• Pedestrians can navigate intuitively within the real environment using both position and orientation information in a mobile virtual environment
  – Dynamic switching of the camera viewpoint from the pedestrian view to a bird’s-eye view can be accessed from the menu buttons

Virtual Pointer
• The digital compass can also be used as a virtual pointer to provide information about the surroundings, such as:
  – ‘What is the name of the building?’
  – ‘How far is it located from me?’

AR POI Textual Presentation

LOCUS Video

Discussion
• Mobile interfaces can take advantage of different visualisation and tracking technologies to provide efficient support for tourist guides through:
  – Traditional digital maps
  – VR 3D maps
  – AR assistance
• Future work:
  – Developing the interfaces further
    • Multimedia AR
  – Plan to perform an evaluation comparing the advantages and disadvantages of the different presentation domains

Examining User Experiences in a Mobile Augmented Reality Tourist Guide
Střelák, D., Škola, F., Liarokapis, F. Examining User Experiences in a Mobile Augmented Reality Tourist Guide, Proc. of the 9th International Conference on PErvasive Technologies Related to Assistive Environments (Petra 2016), ACM Press, Corfu Island, Greece, 29 June - 1 July, Article No. 19, 2016.

Aim
• The aim of this work was to design, implement and evaluate an AR touristic guide for mobile devices
  – It must present interactive historical information to tourists
  – As a case study, the historic city centre of Brno (Czech Republic) was selected

Modeling of Church
• A church used to stand on the main square in the city centre of Brno
  – The St. Nicolas church was modeled in 3D
  – Using old photographs and plans
  – 7400 polygons
[Figures: original information - (a) church outline in the pavement, (b) perspective grid to estimate sizes, (c) church blueprint]

Natural Feature Tracking
[Figures: feature tracking (left) without verification lines, (right) with verification lines behind the 3D cube]

Manual Pose Adjustment

Found Features and Descriptors

Sensor Solution
• The sensor AR solution uses two types of sensors:
  – Rotation vector
  – A combination of accelerometer and geomagnetic sensor
• Events generated by the latter sensors contain raw information without any pre-processing or filtering
  – Linear interpolation between the previous value and the new value is used to smooth the data
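The sensor solution above smooths the raw accelerometer/magnetometer events by linear interpolation between the previous and the new value. A minimal sketch of that filter follows; the blend factor alpha is an assumed tuning parameter, not a value taken from the paper.

```cpp
#include <array>

// Linear interpolation between the previously filtered value and the newly
// reported sensor value; alpha in [0,1] controls how quickly new data is trusted.
std::array<float, 3> smooth(const std::array<float, 3> &previous,
                            const std::array<float, 3> &current,
                            float alpha) {
    std::array<float, 3> out{};
    for (std::size_t i = 0; i < 3; ++i)
        out[i] = previous[i] + alpha * (current[i] - previous[i]);
    return out;
}

// Example use on each sensor event: filtered = smooth(filtered, rawMagnetometer, 0.15f);
```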
Video
https://dl.dropboxusercontent.com/u/56791485/petra.mp4

Computer Vision Detection Time

GPS vs. Computer Vision Accuracy

User Evaluation
• 30 healthy participants
  – 15 males, 15 females
• Two questionnaires
  – Cognitive workload
  – Presence

Qualitative Evaluation
• In general, an easy-to-use and interesting application
• However, some comments:
  – The design of the application needs to be changed to be more intuitive
  – Some form of interactive tutorial should be embedded directly into the application, explaining its controls

Quantitative Evaluation
[Tables: overview of significant correlations; significant correlations]

Results
• Results showed that users found the sensor approach easy to use and intuitive
• The majority reported fast adaptation to the AR application
• As far as gender differences are concerned, females were more satisfied with the AR experience than males and also reported higher temporal demand
• Overall, the feedback showed that AR technology has all the potential to be used for tourist guides, since it is easy to use and intuitive

Environmental Monitoring

Augmented Reality Environmental Monitoring Using Wireless Sensor Networks
Goldsmith, D., Liarokapis, F., Malone, G., Kemp, J. Augmented Reality Environmental Monitoring Using Wireless Sensor Networks, Proc. of the 12th International Conference on Information Visualisation (IV08), IEEE Computer Society, 8-11 July, 539-544, 2008.

Introduction
• Research within the WSN community has led to the development of new computing models
  – Ranging from distributed computing to large-scale pervasive computing environments
• This rapid evolution of pervasive computing technologies has allowed the development of novel interfaces
  – Capable of interacting with sensory information originating from the environment with little or no manual intervention
• Although a number of technologies are able to perform natural interactions, pervasive AR is one of the strongest candidates

Augmented Reality
• AR allows for the seamless integration of virtual and real information in real-time
• An efficient AR system consists of:
  – Real-time video capture, high-speed image processing, virtual world handling, registration and realistic rendering
• Pervasive AR interfaces can support sensor networks to capture
  – Sound, temperature, etc.

SensAR
• An environmental monitoring prototype that uses a WSN to gather temperature and audio data about the user’s surroundings
• SensAR displays the environmental information in an understandable format using a real-time handheld AR interface
• Participants can visualise 3D as well as textual representations of the sound and temperature information in a tangible manner

System Architecture

Hardware
• Sensors: Gumstix Verdex XM4-bt
  – Intel XScale PXA270 400 MHz processor, 16 MB of flash memory, 64 MB of RAM, a Bluetooth controller and antenna, 60-pin and 120-pin connectors for expansion boards, and a further 24-pin flex ribbon connector
• Visualisation: VAIO UX UMPC
  – Intel Core Solo processor at 1.3 GHz, wireless 802.11a/b/g, 32 GB hard drive, 1 GB SDRAM, 4.5" touch panel LCD, a graphics accelerator and 2 built-in digital cameras

Software
• A generic interface to the I2C bus has been implemented to allow access to data from the sensors
  – The API supports other I2C-enabled devices such as digital compasses, pressure sensors, accelerometers and light meters
• The framework supports a range of communication protocols, offering the choice of:
  – Bluetooth-, WiFi- and Ethernet-based data transfer
• The client framework is based on:
  – VRML for the 3D environmental representations
  – The OpenGL API for the graphics rendering
  – The GLUT API for textual augmentations
  – Tracking of the user is based on ARToolKit

Method of Operation
• Users can navigate inside the room by moving the UMPC and detecting different markers
• SensAR checks each video frame for predetermined patterns
  – The current version uses 12 patterns
• The markers are placed so that the centre of the pattern is halfway up the height of the wall
• Different sound and temperature sensors are attached to each marker

Best Marker Detection
• Choosing the best marker gives the ability to detect each marker
  – Ultimately, the one which is currently easiest to detect is used
  – This is done using marker confidence levels
• For every frame the application:
  – Iterates through each marker that is in sight
  – Reads the confidence level of that marker
  – Stores the highest one
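A sketch of the per-frame "best marker" selection described above, using the classic ARToolKit marker structure. The gPatternIds array holding the 12 loaded pattern ids is an assumed name, and this illustrates the idea rather than reproducing the SensAR source.

```cpp
#include <AR/ar.h>

// Among all patterns currently in sight, pick the one ARToolKit reports with
// the highest confidence and use it to anchor the augmentation.
int pickBestMarker(const ARMarkerInfo *markers, int markerNum,
                   const int *gPatternIds, int patternCount) {
    int best = -1;
    double bestConfidence = 0.0;
    for (int i = 0; i < markerNum; i++) {
        for (int p = 0; p < patternCount; p++) {
            if (markers[i].id == gPatternIds[p] && markers[i].cf > bestConfidence) {
                bestConfidence = markers[i].cf;   // confidence level of this detection
                best = i;
            }
        }
    }
    return best;   // index into markers[], or -1 if no known pattern is visible
}
```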
Handheld AR Interface
• A handheld AR interface has been implemented to allow users to experience the environmental information gathered
• Sensors collect sound and temperature level data at various points in space and relay this information to SensAR
• A user interface is then used to seamlessly superimpose computer-generated representations of sound and temperature based on the readings of these sensors

Handheld AR Interface (continued)
• One of the versions of our program also includes a 3D representation of the entire room
• This is projected over the real room in the AR environment

Environmental Data Visualisation
• There is an open issue of how to visually represent environmental data coming from the WSNs
• One of the aims of this work was to select an appropriate metaphor to assist users in the rapid interpretation of the information
• After some informal evaluation, it was decided to represent the environmental information through the use of:
  – A 3D thermometer
  – A 3D music note

Environmental Data Visualisation (continued)
[Figures: low levels of sound and temperature; high levels of sound and temperature]
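One way the thermometer/music-note metaphor can be driven by the sensor readings is to map each reading onto the scale and colour of its glyph, so that low and high levels are distinguishable at a glance. The mapping below is purely illustrative and is not taken from SensAR.

```cpp
#include <algorithm>

// Hypothetical mapping from a sensor reading to the visual parameters of its
// metaphor glyph (the 3D thermometer or the 3D music note).
struct GlyphAppearance { float scale; float r, g, b; };

GlyphAppearance appearanceFor(float reading, float low, float high) {
    float t = (reading - low) / (high - low);          // normalise to [0,1]
    t = std::clamp(t, 0.0f, 1.0f);
    GlyphAppearance g;
    g.scale = 0.5f + 0.5f * t;                         // small glyph for low levels, large for high
    g.r = t;                                           // blend blue (low) to red (high)
    g.g = 0.2f;
    g.b = 1.0f - t;
    return g;
}
```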
Video
SensAR Environmental Monitoring Video: https://www.youtube.com/watch?v=lgVGIahHM8o

Discussion
• SensAR, a prototype mobile AR system for visualising environmental information, including temperature and sound data, was presented
• Sound and temperature data are transmitted wirelessly to our client, which is a handheld device
• Environmental information is represented graphically as 3D objects and textual information in real-time AR
• Participants visualise and interact with the augmented environmental information using a small but powerful handheld computer

Future Work
• Integrate more sensors, including
  – Light, pressure and humidity
• Improve immersion
  – A head-mounted display that includes an orientation tracker
• Improve interaction
  – A digital compass, a VR glove and the Wii controller
• Evaluation
  – Extensive user studies

Games

A Pervasive Augmented Reality Serious Game
Liarokapis, F., Macan, L., Malone, G., Rebolledo-Mendez, G., de Freitas, S. Multimodal Augmented Reality Tangible Gaming, The Visual Computer, Springer, 25(12): 1109-1120, 2009.

Introduction
• The use of serious games in virtual worlds
  – Opens up the possibility of defining learning game-based scenarios
  – Enables collaborative or mediated learning activities that could lead to better learning
  – Engages learners in a multimodal fashion (i.e. using different senses), helping learners to fully immerse themselves in a learning situation
  – The multimodal nature shares resources, spaces and ideas

Pervasive AR Gaming
• Pervasive games can have educational aspects, and the whole idea of playability in pervasive games is:
  – The player’s interaction with physical reality
  – The accessibility space that is the key to the oscillation between embedded and tangible information
• AR refers to the seamless integration of virtual information with the real environment in real-time
• AR interfaces have the potential to enhance ubiquitous computing environments
  – By allowing the necessary information to be visualised in a number of different ways depending on the user’s needs

Motivation
• The main objective of the research is to design and implement generic pervasive interfaces that are user-friendly and can be used by a wide range of users, including people with disabilities
  – A pervasive AR serious game that can be used to enhance entertainment using a multimodal tracking interface
  – An AR pipes game is under development

Architecture

Multimodal Interaction
• The main objective of this work was to allow for seamless interaction between the users and the superimposed environmental information
  – Hand tracking
  – Head orientation
  – Wiimote interaction
  – Pinch glove
  – UMPC

Multimodal Interaction Examples

AR Racing Game
• The main aim of the game is to move the car around the scene using the Wiimote without colliding with the other objects or the fountain
• Virtual objects can be rearranged in real-time by picking and dropping them anywhere in the arena

AR Picking

AR Occlusion
• Once the hand moves to interact with the AR game it obscures a real-world visual marker
  – Multiple marker tracking based on a confidence rating was implemented to represent a single object or set of objects
• The superimposed graphics would otherwise be drawn over the camera feed regardless of the real-world objects and their position
  – The game uses a 3D model of a hand and places it over the known position of the user’s hand, based on the marker position
  – Depth test algorithms check if it is closer to the camera than the other models in the scene

AR Firing
• By making a predetermined gesture, detected by the glove, a user is able to fire a virtual projectile into the scene
• The system determines the direction a projectile would travel from the hand and whether it would intersect with other objects in the scene

Collision Detection
• Using bounding boxes around each of the items in the scene, and one to encompass the user’s hand, intersection testing was used as a simple method of collision detection
  – The car is not able to cross the boundaries of the game board and its progress is impeded by the other obstacles
• Collision detection also enables the program to determine when the user’s hand in the real world is intersecting with an object in the virtual scene
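A minimal sketch of the bounding-box intersection testing described in the Collision Detection slide above; the Aabb type and the containment rule for the game board are assumptions made for illustration, not the game's actual code.

```cpp
// Every item in the scene, and the user's hand, is wrapped in an axis-aligned
// bounding box; two boxes collide when they overlap on all three axes.
struct Aabb { float minX, minY, minZ, maxX, maxY, maxZ; };

bool intersects(const Aabb &a, const Aabb &b) {
    return a.minX <= b.maxX && a.maxX >= b.minX &&
           a.minY <= b.maxY && a.maxY >= b.minY &&
           a.minZ <= b.maxZ && a.maxZ >= b.minZ;
}

bool contains(const Aabb &outer, const Aabb &inner) {
    return inner.minX >= outer.minX && inner.maxX <= outer.maxX &&
           inner.minY >= outer.minY && inner.maxY <= outer.maxY &&
           inner.minZ >= outer.minZ && inner.maxZ <= outer.maxZ;
}

// The car's proposed move is rejected if its box would leave the game board or
// overlap an obstacle; the same intersection test detects the hand touching a
// virtual object in the scene.
bool moveAllowed(const Aabb &car, const Aabb &board, const Aabb *obstacles, int count) {
    if (!contains(board, car)) return false;             // would leave the board
    for (int i = 0; i < count; i++)
        if (intersects(car, obstacles[i])) return false; // blocked by an obstacle
    return true;
}
```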
Spatial Sound
• Spatial sound was superimposed to enhance the level of immersion
  – Examples of defined sound sources include ‘engine’ and ‘collision’
• As the car is started and directed away from the centre of the camera’s view, the sound of its engine gradually reduces in volume
  – Depending on the direction of movement taken, the balance of the stereo sound is altered accordingly
• The volume is relative to the positions of both the camera and the ‘engine’ and/or ‘collision’ sources

Video: Racing Car
https://www.youtube.com/watch?v=k3r181_GW-o

Video: PileAR
https://www.youtube.com/watch?v=0xPIpinN4r8

Initial Evaluation
• Up to now the game has only been qualitatively evaluated in two demonstrations at:
  – Cogent Computing Applied Research Centre
  – Serious Games Institute
• Tested based on the ‘think aloud’ evaluation technique

Cogent Demonstration
• Two tasks were examined:
  – Wiimote interaction
    • The feedback received was positive from all users
    • Although it is possible to detect the yaw rotation against one area, it is impossible to identify different IR sensors placed around the whole environment
  – Pinch glove interaction
    • All users agreed that it is very intuitive to perform the pre-programmed operations
    • Some felt the test could incorporate more visualisation techniques, such as changing colours and the ability to activate/deactivate textual augmentations
• UMPC interaction is tiring with one hand only

SGI Demonstration
• Demonstrated at an internal event with 20 visitors
• Initial feedback stated that the game is very realistic in terms of interactions and enjoyable to play
  – The idea of picking virtual objects and placing them in arbitrary positions was received very enthusiastically
  – Most visitors felt that tangible games present potential for the next generation of gaming
• On the negative side, users:
  – Preferred to experience a more complete gaming scenario, including a score indicating successful achievements
  – Requested more objects in the scene (e.g. obstacles), multiplayer capabilities (e.g. more racing cars) and more tracks with different levels of difficulty

Pile AR Game
[Figures: initial setup; game in progress]

Pilot Evaluation
• 9 users
[Charts: enjoyment of the tangible interaction method vs. the keyboard interaction; level of enjoyment of the keyboard and tangibly controlled versions of the game]

Discussion
• Pervasive computing has the potential to change a number of the applications that we perform in our day-to-day activities
• A generic pervasive AR gaming system was presented, allowing users to interact using a pinch glove and a Wiimote, through tangible ways, as well as through the I/O controls of the UMPC
• Initial evaluation results were very encouraging and provided us with useful recommendations for future work

Future Work
• More games will be designed
  – Another prototype is under development
• Integration of more interaction sensors
  – e.g. 3DOF orientation devices
• A more user-friendly graphical user interface
• A more complex spatial sound system
• Speech recognition to enhance the usability of interactions

Music

Augmented Reality Scenarios for Guitar Learning
Liarokapis, F. Augmented Reality Scenarios for Guitar Learning, Proc. of Theory and Practice of Computer Graphics 2005, Eurographics UK Chapter, Eurographics, University of Kent, Canterbury, UK, 15th-17th June, 163-170, 2005.

Introduction
• An experimental AR system capable of simulating and superimposing 3D audio and 3D visual information is presented
• The aim of this work is to teach the basics of the electric guitar in a more efficient way than traditional methods, through the use of a prototype AR interface toolkit
• Learners are not required to have previous musical experience, although they must be familiar with a computing environment

Operation of the System
[Diagram: the Windows API (MFC lib) coordinates the audio-visual augmentation; the video component (Vision SDK lib) opens the video stream and performs video operations; optical tracking (ARToolKit lib) computes the camera position and orientation; the 3D graphics API (OpenGL lib) performs the visual augmentation and the 3D sound API (OpenAL lib) the audio augmentation; audio-visual information is loaded from the file system.]

System Architecture

Library Architecture

Teaching Material
[Diagram: teaching material (notes, chords) with theoretical and practical parts; digital information (pictures, tablatures, scales, text, 3D models, sound); tutorial implementation and marker generation.]

Audio Architecture
[Diagram: the audio subsystem (3D sound software: 3D audio processing, sound mixing, audio sources) reads audio data from the file system; a control subsystem can visualise 3D sound, transform 3D sound, change attributes and manipulate sources; input/output goes to the speakers.]
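Both the racing game's spatial sound and the guitar tutor's audio subsystem rely on OpenAL (the 3D sound API named in the Operation of the System diagram above). A minimal sketch of the underlying mechanism follows: once a listener and a positioned source exist, OpenAL attenuates the volume and shifts the stereo balance according to their relative positions. Buffer loading is omitted and the function names are illustrative.

```cpp
#include <AL/al.h>
#include <AL/alc.h>

// One-time setup: open the default output device, create a context and pick a
// standard distance-attenuation model.
bool initOpenAL() {
    ALCdevice *device = alcOpenDevice(nullptr);
    if (!device) return false;
    ALCcontext *context = alcCreateContext(device, nullptr);
    alcMakeContextCurrent(context);
    alDistanceModel(AL_INVERSE_DISTANCE_CLAMPED);
    return true;
}

// Create one spatialised source (e.g. the car 'engine' or a guitar note).
// gEngineBuffer is assumed to have been filled with PCM data elsewhere.
ALuint createEngineSource(ALuint gEngineBuffer) {
    ALuint source;
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, (ALint)gEngineBuffer);
    alSourcei(source, AL_LOOPING, AL_TRUE);           // engine hum loops continuously
    alSourcePlay(source);
    return source;
}

// Called every frame: volume and stereo balance follow the relative positions
// of the listener (camera) and the source automatically.
void updateAudio(ALuint source, float camX, float camY, float camZ,
                 float srcX, float srcY, float srcZ) {
    alListener3f(AL_POSITION, camX, camY, camZ);
    alSource3f(source, AL_POSITION, srcX, srcY, srcZ);
}
```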
AR Scenarios
[Figures: interactions with the AR scene; textual augmentation of guitar parts; example of a theoretical scenario; example of a practical scenario]

Augmenting Lyrics and Chords

AR Music Video 1
AR Music Video 2
AR Music Video 3

Pilot Evaluation
• 9-user evaluation

Conclusions
• Numerous applications exist
  – Today we illustrated some of them
• We will see more robust applications in the near future
  – The hardware technology is there
  – More software solutions are required

Questions