Travel and Wayfinding Techniques
PV182: Human Computer Interaction, 02.12.2018
Source & credits: Prof. S. Feiner, Columbia U., N.Y.; Dr. L. Deligiannidis, U. of Georgia, GA; J. Sochor & students of HCI Lab, MU, CZ

What is travel?
• the motor component of navigation
• movement between two locations, setting the position (and orientation) of the user's viewpoint
• the most basic and common VE interaction technique, used in almost any large-scale VE

Travel refers to the actual movement through space, not to the planning, decision-making, or cognitive aspects (wayfinding). Can we really separate these two tasks? You can have travel techniques that do not address wayfinding, but the best travel techniques integrate travel with aids to wayfinding.

We normally consider only setting the viewpoint position when designing travel techniques for immersive VEs, since orientation is taken care of by head tracking. In non-immersive situations, travel techniques must also handle orientation, which makes them more complex (e.g. navigation in a VRML browser). Travel is truly a universal interaction task, except in VEs in which all user interaction is local.

Travelling – level of user control
• Start to move
• Indicate position
  – specify position
    • discrete target specification (select an object in the environment, choose from a list, position a 3D cursor, automatic selection, …)
    • one-time route specification (markers, curvature, distance, …)
    • continuous specification (gaze-directed, pointing, physical steering, virtual controls, 2D pointing)
  – specify velocity
  – specify acceleration
• Indicate orientation
• Stop moving

Travel subtasks
Direction/target selection
– gaze-directed steering
– pointing/gesture steering
– discrete selection (lists, environmental/direct targets)
– 2D pointing
Velocity/acceleration selection
– constant velocity/acceleration
– gesture-based
– explicit selection (discrete, continuous)
– user/environment scaling
– automatic/adaptive
Input conditions
– constant travel/no input
– continuous input
– start and stop inputs
– automatic start or stop
(A code sketch combining these subtasks into a per-frame travel loop follows the Exploration slide below.)

Travel tasks with respect to navigation
Exploration
– travel with no specific target
– build knowledge of the environment
Search
– naïve: travel to find a target whose position is not known
– primed: travel to a target whose position is known
– build layout knowledge; move to the task location
Maneuvering
– travel to position the viewpoint for a task
– short, precise movements

Travel – Exploration
Exploration tasks have no explicit goal for the movement. The user is simply exploring or browsing the 3D space, usually to build knowledge of the environment or to see what the interesting features might be. Travel techniques for exploration should be almost thoughtless – you want the user to be able to simply move about with little restriction. Video: zanbaka1
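As a concrete illustration of the subtask decomposition above, here is a minimal sketch (not from the slides) of a per-frame travel update that combines direction selection (gaze-directed), velocity selection, and an input condition (move only while an input is active). The function and parameter names are illustrative.

import numpy as np

def travel_update(position, gaze_forward, speed, input_active, dt):
    """One frame of gaze-directed steering.

    position     -- current viewpoint position, shape (3,)
    gaze_forward -- unit forward vector from the head tracker (direction selection)
    speed        -- metres per second (velocity selection: constant, user-set, or adaptive)
    input_active -- True while the travel button/gesture is held (input condition)
    dt           -- frame time in seconds
    """
    if not input_active:
        return position
    return position + speed * dt * np.asarray(gaze_forward, dtype=float)

# Example: one 60 Hz frame, looking down the -z axis, moving at 2 m/s
pos = travel_update(np.zeros(3), gaze_forward=(0.0, 0.0, -1.0),
                    speed=2.0, input_active=True, dt=1/60)

Swapping gaze_forward for the hand tracker's forward vector turns the same loop into pointing-based steering.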
Travel – Search
Search tasks involve traveling to a specific target location in the environment. Psychologists have further subdivided this task into naïve search, in which the position of the target is not known, and primed search, in which the position of the target is known (to some degree) and the task is to find that location again. See Darken & Sibert, CHI '96, for the use of these tasks in a VE.

Travel – Maneuvering
Maneuvering tasks usually involve short, precise movements where the goal is to change the viewpoint slightly in order to perform a particular task. In a surgery simulation, for example, the doctor might need to move to the other side of the operating table to get a good view of the procedure. In some cases, maneuvering can be supported simply by letting the user move physically. However, due to limited tracker range, cable tangling, and similar problems, you sometimes have to provide an explicit travel technique for maneuvering.

"Natural" travel metaphors
Walking techniques
– large-scale tracking
– walking in place (GAITER) – video
– CircularFloor – video
Treadmills
– single-direction with steering
– omni-directional
Bicycles
Other physical motion techniques
– VMC / magic carpet
– Disney's river raft ride

Many people have worked on naturalistic travel techniques that use physical motion and some (pseudo) real-world metaphor for travel. Such techniques are clearly useful for applications such as training, where you want the user to move about as she would in the physical world to make the VE as lifelike as possible. However, naturalism is not always the best choice. We present a few of the natural travel metaphors here, but focus for the remainder of this section on "virtual" or "magic" travel techniques, in which little or no physical motion is involved.

Naturalism also raises the issue of realism in general. Too often it is asserted that VEs need to be realistic, without defining what type of realism is meant. There is visual realism versus realism of interaction, for example. Within interactive realism, it may be that interaction techniques for certain tasks need to be realistic while others do not.

Walking techniques: these try to emulate physical walking so that the vestibular and kinesthetic senses can be used fully. A brute-force way to allow this is simply to expand the tracked area. There are also efforts to analyze the user's motions while walking in place to determine direction, speed, and so on; the GAITER project is one example (a sketch of one possible detector follows the bicycle slide below). Iwata's work has also looked at ways to allow a more natural walking motion without physical translation, such as roller skates or slick shoes on a slick floor.

Treadmills: also designed to allow physical walking or running, but using a bigger, more expensive device. Brooks at UNC used a standard exercise treadmill outfitted with a steering apparatus for movement in a VE. The Torus treadmill (Iwata, IEEE Virtual Reality, 1999) is an omni-directional treadmill that allows walking in any direction. video

Bicycles: have been used to simulate real motion based on pedaling and force feedback. To be realistic, the user needs to feel as if they are on a real bicycle, and turning must be simulated properly.
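Walking-in-place techniques such as GAITER analyze tracker data while the user steps on the spot. As a hedged illustration only – this is not GAITER's actual algorithm – the sketch below infers stepping from the vertical "head bob" seen by the head tracker; while stepping is detected, the viewpoint would be translated along the user's forward direction.

from collections import deque

class WalkInPlaceDetector:
    """Very simple step detector based on recent vertical head motion."""

    def __init__(self, bob_threshold=0.02, window=90):
        self.heights = deque(maxlen=window)   # last ~1.5 s of head heights at 60 Hz
        self.bob_threshold = bob_threshold    # minimum excursion (m) counted as stepping

    def update(self, head_height):
        """Feed one head-height sample (metres); returns True while the user appears to be stepping."""
        self.heights.append(head_height)
        if len(self.heights) < self.heights.maxlen:
            return False
        return (max(self.heights) - min(self.heights)) > self.bob_threshold

Speed could then be scaled by the bob amplitude or step frequency, and direction taken from the head or torso orientation.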
"Natural" travel metaphors
Virtual Motion Controller (VMC) – shown in the figure. The VMC has sensors built into it that detect the user's weight distribution and thus which part of the platform the user is centered over. The direction of motion is controlled by stepping in that direction on the device; speed is controlled by the distance from the center (a code sketch of this mapping follows the route-planning slides below).

Magic carpet – a student project at Georgia Tech very similar to the VMC. Physically, it was a board on which the user stood. Nine regions were painted on the board, allowing the user to move in any of the eight compass directions or to stop by standing in the center.

River raft ride at DisneyQuest – in this ride, guests sit in a rubber raft on top of an inflatable membrane that simulates the water's motion. Motion is controlled by a combination of all the guests' paddle strokes, allowing steering, speed control, and so on.

Common virtual travel metaphors
steering: continuous specification of the direction of motion
– gaze-directed
– pointing
– physical device (steering wheel, flight stick)
target-based: discrete specification of the goal
– point at an object
– choose from a list
– enter coordinates

A very large number of different virtual travel techniques (with little or no physical motion by the user) have been suggested and/or implemented by researchers and application developers. Most of these techniques fall into four categories or metaphors.

The steering metaphor implies continuous control of the direction of motion, as when you steer a car. Examples:
– gaze-directed steering: look in the direction you want to go
– pointing: point in the direction you want to go
– physical props such as a steering wheel, accelerator, and brake

The target-based metaphor lets the user choose the end goal of the motion; the system then performs the movement between the two points. Examples:
– point at the object to which you want to travel
– make a selection from a menu or list of choices
– enter the coordinates of the point to which you want to move

Target specification via object selection (figure)
Map-based target specification (figure)

Common virtual travel metaphors
route planning: one-time specification of a path
– place markers in the world
– move an icon on a map
manual manipulation of the viewpoint
– "camera in hand"
– fixed-object manipulation

Planning a route with a 2D map (figure)

Route planning is a compromise between the steering and target-based metaphors: the user specifies the end goal and also some intermediate path information. Examples:
– using some manipulation technique, put markers in the world to denote your path
– place markers on a map of the space
– move an icon representing yourself on a map of the space (e.g. World-in-Miniature; Pausch, Burnette, Brockway, and Weiblen, 1995)
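The map-based target and route specification just described boils down to converting a point marked on the 2D map into a world-space travel target. A minimal sketch, assuming the map covers a known axis-aligned region of the world; the names and values are illustrative:

def map_point_to_world(u, v, world_bounds, eye_height=1.7):
    """Convert normalized map coordinates (u, v) in [0, 1]^2 to a world-space target.

    world_bounds -- ((x_min, z_min), (x_max, z_max)), the region the map depicts
    eye_height   -- y value to keep the viewpoint at after arriving
    """
    (x_min, z_min), (x_max, z_max) = world_bounds
    x = x_min + u * (x_max - x_min)
    z = z_min + v * (z_max - z_min)
    return (x, eye_height, z)

# A planned route is then just an ordered list of such targets:
route = [map_point_to_world(u, v, ((0.0, 0.0), (100.0, 100.0)))
         for u, v in [(0.1, 0.2), (0.4, 0.25), (0.8, 0.7)]]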
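Returning to the Virtual Motion Controller and magic carpet above: the described control mapping takes the travel direction from the direction in which the user steps away from the platform center and the speed from the distance to the center. A minimal sketch with a dead zone, so that standing near the center means "stop" (the gain, dead-zone, and speed values are illustrative, not taken from the actual devices):

import math

def vmc_velocity(offset_x, offset_z, dead_zone=0.05, gain=2.0, max_speed=3.0):
    """Map the user's offset from the platform center (metres) to a planar velocity (m/s)."""
    dist = math.hypot(offset_x, offset_z)
    if dist < dead_zone:
        return (0.0, 0.0)
    speed = min(gain * (dist - dead_zone), max_speed)
    return (offset_x / dist * speed, offset_z / dist * speed)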
More common travel metaphors
Something different is manual viewpoint manipulation, in which the user's hand motions are mapped in some way to viewpoint motion. Examples:
– camera in hand: the user moves his hand over a map of the space, with the hand acting as the camera through which the 3D space is viewed
– fixed-object manipulation: the user selects an object and moves his hand as if to manipulate the object's position; the object stays fixed, and the user moves about the object. The real-world analog is grabbing a flagpole: when you move your hand, the flagpole stays put and you move around it.
– grabbing the air: grab anywhere in the air or on the world, and move the world relative to yourself rather than moving yourself relative to the world. The real-world analog is pulling yourself along a rope. (A code sketch of this mapping follows the design-guideline slides below.)

Evaluation results
Steering techniques have similar performance on absolute motion tasks.
Non-head-coupled steering is better for relative motion.
"Teleportation" can lead to significant disorientation.
Environment complexity affects the ability to gather information while moving.
The travel technique and the user's strategies affect spatial orientation.
Manipulation-based techniques not requiring an object are efficient for search and relative motion, but tiring.
Map-based techniques are not effective in unfamiliar environments, or when any precision is required.

Three myths about travel techniques
M1: There is one optimal travel technique for VEs. NO!
M2: A "natural" technique will always deliver better performance, usability, and usefulness than another technique. NO!
M3: Desktop 3D, workbench, and CAVE applications should use the same travel techniques as HMD-based VEs. NO!

Design guidelines
Make simple travel tasks simple (target-based techniques for motion to an object, steering techniques for search).
Provide multiple travel techniques to support different travel tasks in the same application.
Use graceful transitional motions if overall environment context is important.

Most travel tasks are simple in the mind of the user – they just want to change their location while focusing on something else. Thus, you should use a technique that meets the requirements of the task – e.g. use a target-based technique if the only goal is to move between known objects – and avoid putting unnecessary cognitive load on the user.

Remember the differences between tasks such as exploration and primed search – you may need more than one technique. There is a tradeoff between the specificity of the techniques and the learning load you place on the user. In many cases, multiple techniques requiring a bit more learning time may be much more efficient in the long run.

Many applications require the user to be aware of their location within the space, to have overall survey knowledge of the space, and so on. In these cases it is important to use a transitional motion between locations, even if it is fast, in order to maintain awareness of the space.
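The guideline above calls for a (possibly fast) transitional motion rather than an instantaneous jump. A minimal sketch, assuming SciPy is available for the rotation math and that start_rot/goal_rot are scipy Rotation objects: interpolate the position linearly and the orientation by quaternion slerp over a short, fixed duration.

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def make_transition(start_pos, goal_pos, start_rot, goal_rot, duration=1.0):
    """Return f(t) -> (position, Rotation) for t seconds into the transitional motion."""
    key_rots = Slerp([0.0, 1.0], Rotation.from_quat([start_rot.as_quat(),
                                                     goal_rot.as_quat()]))
    p0, p1 = np.asarray(start_pos, dtype=float), np.asarray(goal_pos, dtype=float)

    def f(t):
        s = float(np.clip(t / duration, 0.0, 1.0))   # interpolation parameter in [0, 1]
        return (1.0 - s) * p0 + s * p1, key_rots([s])[0]

    return f

An easing function applied to s (e.g. smoothstep) would make the motion start and stop gently; the important point from the evaluation results is simply that the in-between frames exist.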
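Returning to the "grabbing the air" metaphor above: while the grab gesture is held, the world is dragged by the hand – equivalently, the viewpoint moves by the opposite of the hand's displacement, like pulling yourself along a rope. A minimal sketch, with hand positions assumed to be reported in the tracker's body-fixed frame:

def grab_the_air_update(viewpoint_pos, hand_pos, prev_hand_pos, grabbing):
    """Move the viewpoint opposite to the hand's frame-to-frame displacement while grabbing."""
    if not grabbing or prev_hand_pos is None:
        return viewpoint_pos
    hand_delta = [h - p for h, p in zip(hand_pos, prev_hand_pos)]
    return tuple(v - d for v, d in zip(viewpoint_pos, hand_delta))

Fixed-object manipulation applies the same idea, except that the inverse of the hand's motion (including any rotation about the selected object) is applied to the viewpoint.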
More design guidelines
Strategies (how the user uses the technique) are as important as the technique itself, especially in tasks requiring spatial knowledge. Therefore, you should provide training, instructions, and guidance to help the user take advantage of the technique.
Cross-task interaction techniques can be useful if travel is not the main interaction, but is only used, for example, to gain a better viewpoint for a manipulation task. Remember that such motion can be tiring, however, and should not be used in applications with very long exposure periods.

Wayfinding
Wayfinding is the cognitive process of defining a path through an environment, using and acquiring spatial knowledge, helped by (artificial) cues. Wayfinding in virtual worlds is used to improve navigation in both the real world and the virtual world.
Source: S Feiner, Columbia U., NY

The user makes decisions (whereabouts, trajectory) by processing input (environmental information) to produce output (travel). Environmental information is stored in long-term memory as a cognitive map; wayfinding both uses and builds this cognitive map.
Spatial orientation: knowledge of one's position and orientation.
Situation awareness: spatial orientation + understanding of the cognitive map + the ability to predict. It is task-specific: "perception of elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future" — M R Endsley, 1988.

Model of navigation: Jul and Furnas (1997) (figure)
Source: R Darken & B Petersen 2002, vehand.engr.ucf.edu/handbook/Chapters/Chapter28/Chapter28.html

Why is wayfinding in 3D UIs hard?
• Virtual worlds lack real-world cues
  – missing physical motion constraints (e.g., walls you can walk through)
• Lack of motion (vection) cues
  – use of virtual motion techniques or other non-isomorphic motion techniques
• Constrained visual cues
  – restricted FOV, detail, frame rate, …
Six degrees of freedom also make wayfinding hard: people differ in their ability to orient themselves in an environment, and the extra freedom can easily disorient them.

There are several particular differences between real-world wayfinding and wayfinding in a virtual world. The added degrees of freedom make it harder to find one's way through a VE: whereas we are normally constrained in our actions (gravity, walking on floors), we can often fly freely through a VE. This unconstrained behavior does not match how we move through a real environment, and the mismatch is strengthened by the lack of real-motion cues during virtual movement.
Wayfinding tasks
Exploration
– browsing; build a cognitive map
Search
– acquire and use spatial knowledge
– naïve search: target-based, but the user does not know the exact target location
– primed search: target-based with a known target location; relies on the cognitive map
Maneuvering
– small-scale movements to refine position/orientation
– may be a subtask of search
Specified-trajectory movement
– the user travels along a predefined path to build a cognitive map

Spatial knowledge
Landmark knowledge
– important characteristics of the environment
– perceptually prominent objects (landmarks)
– visual features: shape, size, color, texture
Procedural/route knowledge
– the sequence of actions needed to follow a path
Survey knowledge
– topological knowledge of the environment
– object locations/orientations, distances between objects
– also called "map knowledge"
– takes the longest to acquire of the three

Approaches to wayfinding
User-centered: address the human senses.
Environment-centered: address the design of the environment.

User-centered wayfinding: field of view
Increasing the FOV
– reduces the need for head motion
– increases peripheral vision, including optical-flow cues
– decreases cybersickness
– improves the ability to understand spatial relationships
M Czerwinski, D Tan, and G Robertson, "Women take a wider view," Proc CHI 2002, 195–202: subjects navigated a 3D environment presented on small displays with a narrow field of view and on large displays with a wide field of view. Results: with a narrow field of view, men outperform women; with a wide field of view, both women and men perform better, and the gender difference is significantly reduced.

User-centered wayfinding: motion (vection) and orientation cues
• visual cues
• vestibular cues (visual/vestibular conflict)
• tactile
• proprioceptive
• auditory
• olfactory
Walking > walking in place > purely virtual travel (C Zanbaka et al., IEEE VR 2004) video

User-centered wayfinding: presence
The sense of "being there." Measured by
– behavior: e.g., ducking to avoid being hit, physiological responses
– questionnaire responses
Presence aids the effectiveness of real-world wayfinding cues. It is promoted by
– improved emulation of real-world effects, e.g., low lag, increased FOV
– a virtual body, e.g., looking down and seeing your feet

User-centered wayfinding: search strategies
Learn an approach that aids effective wayfinding (e.g., based on expert knowledge):
– search patterns/paths
– switching between egocentric and exocentric views (see the sketch below)

Environment-centered wayfinding: environment design
Design the environment for effective wayfinding:
– legibility techniques
– real-world principles

Environment-centered wayfinding: legibility
Legibility: "the ease with which [a city's] parts may be recognised and can be organised into a coherent pattern" — K Lynch, The Image of the City, 1960.
Legibility techniques (Lynch):
– divide the environment into distinct parts
– organize the parts spatially to clarify the relationships among them
– use directional cues to support matching egocentric and exocentric reference frames
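Switching between egocentric and exocentric views, and matching the two reference frames, can be supported by offering a bird's-eye camera tied to the user. A minimal sketch (not from the slides): the camera sits directly above the user looking straight down, and the user's heading is mapped to "up" on the screen so the exocentric view stays aligned with the egocentric one. Heading is measured here from +z (north) toward +x.

import math

def exocentric_camera(user_pos, user_heading_rad, height=30.0):
    """Return (camera_position, look_at_point, up_hint) for an overhead view of the user."""
    x, y, z = user_pos
    camera_pos = (x, y + height, z)              # straight above the user
    look_at = (x, y, z)                          # looking down at the user
    up_hint = (math.sin(user_heading_rad), 0.0,  # the user's forward direction becomes
               math.cos(user_heading_rad))       # screen-up (a forward-up overhead view)
    return camera_pos, look_at, up_hint

Passing a fixed up_hint of (0, 0, 1) instead would give a north-up overhead view, the same distinction that applies to maps later in this section.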
Environment-centered wayfinding: legibility
Building blocks of cognitive maps (after Lynch):
– Landmarks: static, recognizable objects
– Districts: sections of the environment with a distinct character providing coherence – style (architectural), color, lighting, use
– Paths: major avenues of travel – roads, footpaths
– Nodes: points of interest on paths – intersections, town squares
– Edges: borders of districts or obstacles – waterfront, the "wrong side of the tracks"; may also serve as paths

Environment-centered wayfinding: real-world design principles
• Natural environment: e.g., horizon, aerial perspective
• Built environment (architecture): illumination for recognizability/emphasis; openings to guide users
• Use of color/texture to group and emphasize in both natural and built environments

Environment-centered wayfinding: artificial cues
• Maps
• Compasses
• Signage
• Reference objects
• Trails
• Grids
• Audio/olfactory cues

Artificial cues: maps
A map is a graphic representation of an area, drawn to scale. A "you-are-here" map includes a location marker.
Orientation:
– north up: better for exocentric tasks
– forward up: better for egocentric tasks
Position/size:
– fixed vs. movable
– user-controlled vs. system-controlled
– same display vs. separate display
R Darken & B Petersen 2002, vehand.engr.ucf.edu/handbook/Chapters/Chapter28/Chapter28.html

Artificial cues: maps (B Bell, T Höllerer, and S Feiner, UIST 2002)
A situation-awareness aid controlled by head orientation (yaw → yaw, pitch → pitch), position, scale, and annotation. Annotations move between the real world and the aid. (figures)
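The north-up versus forward-up distinction above comes down to how map coordinates are computed from the user's pose. A minimal sketch, with the you-are-here marker at the map center and heading measured from +z (north) toward +x: for forward-up, the offsets are rotated by the user's heading so that the facing direction points to the top of the map.

import math

def world_to_map(point_xz, user_xz, user_heading_rad, forward_up=True):
    """Return map-space (right, up) coordinates of a world point, with the user at the origin."""
    dx = point_xz[0] - user_xz[0]
    dz = point_xz[1] - user_xz[1]
    if not forward_up:
        return (dx, dz)                              # north-up: +z (north) stays at the top
    c, s = math.cos(user_heading_rad), math.sin(user_heading_rad)
    return (c * dx - s * dz, s * dx + c * dz)        # forward-up: rotate by the heading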
Artificial cues: maps – Step WIM (J LaViola, D Feliz, D Keefe, & R Zeleznik, SI3D 2001)
A world-in-miniature (WIM) displayed on the floor, invoked by a toe tap; the user walks around the WIM. A toe tap dismisses the WIM if the user is looking up, or travels to the current location if the user is looking down.
A WIM scale mode is invoked by a heel click; the user then walks relative to the position at the click to scale the WIM up or down, and clicks the heels again to exit scale mode.

Artificial cues: compasses
A pointer to north or to another designated object, typically always visible.
– e.g., a floating compass arrow pointing north (R Darken & B Petersen 2002, vehand.engr.ucf.edu/handbook/Chapters/Chapter28/Chapter28.html)
– e.g., a pointer to the currently selected object, rotating in the image plane (Columbia Mobile Augmented Reality System, www.cs.columbia.edu/graphics/projects/mars/marsUIs.html)

Artificial cues: signs
Displays identifying objects or places, e.g., labels.

Artificial cues: reference objects
Objects of known scale facilitate the estimation of size and distance, e.g., people.

Artificial cues: artificial landmarks
Objects added specifically for wayfinding; distinguishable and positioned for recognizability (R Darken & B Petersen 2002).

Artificial cues: trails
Objects added to show the user's path.
• e.g., lines, breadcrumbs, footprints (showing direction)
• could offer functionality for following the trail
• cause clutter if left indiscriminately (R Darken & B Petersen 2002)

Artificial cues: grids
Regular ruled overlays that partition the environment, allowing users to determine the current partition and organize a search; e.g., radial or rectangular (R Darken & B Petersen 2002).

Wayfinding evaluation
• Time to target
• Path analysis: length, unnecessary turns, repetition
• Ability to draw layout sketches: number of (in)correct objects; relative/absolute object size, orientation, and position
Figure: paths in targeted map searches (with the target on the map), (A) forward-up, (B) north-up — R Darken & H Cevik, IEEE VR 99
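The path-analysis measures above (length, unnecessary turns, repetition) can be computed directly from a logged trajectory. A minimal sketch for path length and a simple sharp-turn count; the 60-degree threshold is an illustrative choice, not a standard value.

import math

def path_metrics(points, turn_threshold_deg=60.0):
    """points: list of (x, z) positions sampled along the user's path."""
    length = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    turns = 0
    for a, b, c in zip(points, points[1:], points[2:]):
        h1 = math.atan2(b[1] - a[1], b[0] - a[0])            # heading of segment a -> b
        h2 = math.atan2(c[1] - b[1], c[0] - b[0])            # heading of segment b -> c
        change = abs((h2 - h1 + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(change) > turn_threshold_deg:
            turns += 1
    return length, turns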
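Returning to the trails cue above: dropping a breadcrumb only after the user has moved some minimum distance from the previous one keeps the trail readable instead of cluttering the environment. A minimal sketch (the 2 m spacing is an arbitrary example):

import math

class BreadcrumbTrail:
    def __init__(self, spacing=2.0):
        self.spacing = spacing       # minimum distance in metres between breadcrumbs
        self.crumbs = []             # dropped positions, oldest first

    def update(self, position):
        """Call every frame with the user's (x, y, z) position; drops a crumb when due."""
        if not self.crumbs or math.dist(position, self.crumbs[-1]) >= self.spacing:
            self.crumbs.append(tuple(position))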
Design guidelines
• Match the cue to the task (e.g., maps suit physical environments, but not necessarily abstract data sets)
• Match the cue to the user's skills
• Don't make cues dominant features
• Choose input devices that provide real motion cues if possible
• Use animation; avoid teleportation
• Integrate the travel and wayfinding components
Source: T Höllerer, UCSB

Design guidelines: maps and landmarks
• Use YAH (you-are-here) maps
• Consider multiple maps at different scales
• Carefully choose the map orientation
• Ensure legibility
• Use appropriate size and placement (reduce occlusion)
• Carefully place and distinguish landmarks
Source: T Höllerer, UCSB