PA201 Virtual Environments
Lecture 6: Virtual Reality Interaction
Fotis Liarokapis
liarokap@fi.muni.cz
27th March 2018

History of HCI

RAND's Vision of the Future
http://www.imageshack.us/ (original source unknown)
• Scientists from the RAND Corporation created this model to illustrate what a "home computer" could look like in the year 2004
• However, the needed technology would not be economically feasible for the average home
• The scientists also readily admitted that the computer would require not-yet-invented technology to actually work, but scientific progress over the following 50 years was expected to solve these problems
• With a teletype interface and the Fortran language, the computer would be easy to use

ENIAC (1943)
• A general view of the ENIAC, the world's first all-electronic numerical integrator and computer
From IBM Archives

Mark I (1944)
• The Mark I paper tape readers
From Harvard University Cruft Photo Laboratory

'As We May Think' (1945)
• Vannevar Bush identified the information storage and retrieval problem:
– New knowledge does not reach the people who could benefit from it
• He envisioned a sort of collective memory machine, the memex, that would make knowledge more accessible
– The memex was a hypothetical proto-hypertext system
https://en.wikipedia.org/wiki/As_We_May_Think

Bush's Memex
• Conceived ideas behind hypertext and the World Wide Web
– A device in which an individual stores all personal books, records, communications, etc.
– Items are retrieved rapidly through indexing, keywords, cross references, ...
– Text can be annotated with margin notes and comments
– A trail (chain of links) through the material can be constructed and saved
– Acts as an external memory!

Bush's Memex .
• Bush's Memex was based on microfilm records!
– It was never implemented

Ball Tracker System (1946)
• Ralph Benjamin invented a ball tracker system, called the roller ball, in 1946
– Invented as part of a post-World War II-era radar plotting system named the Comprehensive Display System (CDS)
– Kept as a military secret for a few years!
– Only fully implemented as a usable device by the Canadian navy some years later, in its Digital Automated Tracking and Resolving system (DATAR)
https://arcadeblogger.com/2016/07/29/the-secret-history-of-the-arcade-trackball/

IBM SSEC (1948)
• The IBM Selective Sequence Electronic Calculator (SSEC) was an electromechanical computer built by IBM
– Design started in late 1944
– Operated from January 1948 to 1952
From IBM Archives
https://en.wikipedia.org/wiki/IBM_SSEC

DATAR (1952-1953)
• DATAR was a method of giving all ships within a particular fleet a "single view" of their operational status
– It collated information provided by sensors placed on board each vessel
https://arcadeblogger.com/2016/07/29/the-secret-history-of-the-arcade-trackball/

DATAR (1952-1953) .
• The trackball could be used by operators of the system to manage and upload this data to the main DATAR mainframe, for processing and distribution back to each ship
– The actual hardware used to display this information was adapted from a radar screen
https://arcadeblogger.com/2016/07/29/the-secret-history-of-the-arcade-trackball/
J.C.R. Licklider (1960)
• Outlined "man-computer symbiosis":
"The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today."

IBM 7030 - Stretch (1961-62)
• The IBM 7030, also known as Stretch, was IBM's first transistorized supercomputer
• Fastest computer in the world from 1961 until 1964
– The first CDC 6600 became operational in 1964
https://en.wikipedia.org/wiki/IBM_7030_Stretch

SketchPad (1963)
• Ivan Sutherland's SketchPad, his 1963 PhD work
• A sophisticated drawing package
• Many ideas/concepts now found in today's interfaces
http://accad.osu.edu/~waynec/history/images/ivan-sutherland.jpg

SketchPad Concepts (1963)
• Hierarchical structures:
– Defined pictures and sub-pictures
• Object-oriented programming:
– Master picture with instances
• Constraints:
– Details which the system maintains through changes
• Icons:
– Small pictures that represent more complex items
• Copying:
– Pictures and constraints
• Input techniques:
– Efficient use of the light pen
• World coordinates:
– Separation of screen from drawing coordinates
• Recursive operations:
– Applied to children of hierarchical objects

First Mouse Prototype (1964)
• Bill English joined ARC, where he helped Douglas Engelbart build the first mouse prototype
• Early models had a cord attached to the rear of the device that looked like a tail
– This resembled a common mouse
Douglas Engelbart holding the first computer mouse; Bill English
https://en.wikipedia.org/wiki/Computer_mouse

Xerox Alto (1973)
• The Xerox Alto was the first computer designed from its inception to support an operating system based on a graphical user interface (GUI)
– Later using the desktop metaphor
https://en.wikipedia.org/wiki/Xerox_Alto

The Xerox Alto Mouse (1973)
• The first mouse-driven GUI computer
• Featured:
– A three-button mouse
– A steel tracking ball
https://github.com/Gouthamve/Evolution-of-a-mouse/blob/master/README.md

ALTAIR 8800 (1974)
• The Altair 8800 is a microcomputer designed in 1974 by MITS
– Based on the Intel 8080 CPU
https://en.wikipedia.org/wiki/Altair_8800

Xerox Star 8010 (1981)
• First commercial personal computer designed for 'business professionals'
– Introductory price: $16,500 (equivalent to $43,467 in 2016)
– Discontinued: 1985
• Incorporated various technologies that have since become standard in personal computers:
– e.g. a bitmapped display, a window-based graphical user interface, icons, folders, a two-button mouse, Ethernet networking, file servers, print servers, and e-mail
https://en.wikipedia.org/wiki/Xerox_Star

Apple Lisa (1983)
• Based upon many ideas in the Star
– Predecessor of the Macintosh
– Somewhat cheaper ($10,000)
– A commercial failure
http://fp3.antelecom.net/gcifu/applemuseum/lisa2.html

Macintosh 128K (1984)
• Succeeded because:
– Aggressive pricing ($2,500)
– It learnt from the mistakes of the Lisa and corrected them - the ideas were now "mature"
– The market was now ready for them
– A developer's toolkit encouraged 3rd-party, non-Apple software
– Interface guidelines encouraged consistency between applications
– It dominated desktop publishing because of the affordable laser printer and excellent graphics
https://en.wikipedia.org/wiki/Macintosh

Logitech Wireless Mouse (1991)
• In 1991, Logitech released the first wireless mouse:
– Logitech's Cordless MouseMan
• Earlier releases:
– 1982: a 3-button mouse
– 1990: the MouseMan Left, MouseMan Right and MouseMan Large
https://github.com/Gouthamve/Evolution-of-a-mouse/blob/master/README.md

Logitech MX-1000 (2004)
• In 2004, the Logitech MX-1000 introduced the first optical mouse with laser-based tracking
– Made the mouse experience a lot smoother
https://github.com/Gouthamve/Evolution-of-a-mouse/blob/master/README.md

Magic Mouse (2009)
• Apple released the Magic Mouse in 2009
• Minimalist design
• Multi-touch pad
https://github.com/Gouthamve/Evolution-of-a-mouse/blob/master/README.md

Other Events
• MIT Architecture Machine Group
– Nicholas Negroponte (1969-1980+) with many innovative inventions:
• wall-sized displays
• use of video disks
• use of artificial intelligence in interfaces (the idea of agents)
• speech recognition merged with pointing
• speech production
• multimedia hypertext
• ...
• ACM SIGCHI (1982)
– Special interest group on computer-human interaction
– Conferences draw between 2,000 and 3,000 people
• HCI Journals
– Int. J. Man-Machine Studies (1969)
– Many others since 1982

Interaction Basics

Introduction
• How should users interact with the virtual world?
• How should they move about?
• How can they grab and place objects?
• How should they interact with representations of each other?
• How should they interact with files or the Internet?
http://vr.cs.uiuc.edu/

Universal Simulation Principle
• Any interaction mechanism from the real world can be simulated in VR
– For example, the user might open a door by turning a knob and pulling
– As another example, the user might operate a virtual aircraft by sitting in a mock-up cockpit
– One could even simulate putting on a VR headset, leading to an experience that is comparable to a dream within a dream!
http://vr.cs.uiuc.edu/
A flight simulator used by the US Air Force (photo by Javier Garcia)

Terminology
• Interaction Technique (IT)
– A method for accomplishing a task
• 3D application
– A system that displays 3D information
• 3D interaction
– Performing user actions in three dimensions
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Relationships
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Virtual-SAP Video Example
https://www.youtube.com/watch?v=Xz_J0EK8LLs

Universal Interaction Tasks
• Navigation
– Travel (the motor component; see later on)
– Wayfinding (the cognitive component)
• Selection
• Manipulation
• System control
• Symbolic input
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Selection & Manipulation
• Selection
– Specifying one or more objects from a set
• Manipulation
– Modifying object properties
• e.g. position, orientation, scale, shape, color, texture, behavior, etc.
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Goals of Selection
• Indicate action on an object
• Query an object
• Make an object active
• Travel to an object's location
• Set up manipulation
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Selection Performance
• Variables affecting user performance
– Object distance from the user
– Object size
– Density of objects in the area
– Occluders
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Common Selection Techniques
• Touching with a virtual hand
• Ray/cone casting
• Occlusion / framing
• Naming
• Indirect selection
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Go-Go Interaction Technique
• The Go-Go immersive interaction technique uses the metaphor of interactively growing the user's arm, with a non-linear mapping, for reaching and manipulating distant objects
– Unlike other techniques, it allows seamless direct manipulation of both nearby objects and those at a distance
http://www.ivanpoupyrev.com/e-library/1998_1996/uist96.pdf

Go-Go Description
• Having an arm that grows at will is a compelling prospect
• However, implementing this metaphor in VR poses three major challenges:
– How to enable users to tell the system when they want to expand their virtual arm
– How users can control their virtual arm length
– How to make the implementation of this metaphor intuitive, seamless and easy to use
http://www.ivanpoupyrev.com/e-library/1998_1996/uist96.pdf

Mapping Go-Go
• A mapping function F divides the space around the user into two parts:
– Linear mapping
– Non-linear mapping
• The mapping function F (with k = 1/6) consists of a linear part (Rr < D) and a non-linear part (Rr > D)
• The position of the real hand is given by the vector Rr; the position of the virtual hand is given by the vector Rv (see the code sketch below)
http://www.ivanpoupyrev.com/e-library/1998_1996/uist96.pdf

Go-Go Examples
• With traditional VR manipulation techniques the user cannot reach the vase; reach is limited by arm length
• The Go-Go technique allows users to expand their reach; the white cube shows the real hand position
http://www.ivanpoupyrev.com/e-library/1998_1996/uist96.pdf
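Following the mapping in the cited UIST '96 paper, the virtual hand stays at the real hand distance (Rv = Rr) while Rr < D, and extends quadratically beyond the threshold, Rv = Rr + k(Rr - D)^2. Below is a minimal Python sketch of that mapping; the function name, the chest-relative measurement point, and the default value of D are illustrative assumptions rather than values prescribed by the paper.

```python
import math

def go_go_virtual_hand(real_hand, chest, D=0.45, k=1 / 6.0):
    """Map the real hand position to the virtual hand position using the
    Go-Go non-linear reach extension.

    real_hand, chest: (x, y, z) positions in meters (chest is an assumed
    measurement origin).
    D: distance threshold; inside it the mapping is the identity.
    k: growth coefficient of the non-linear part (k = 1/6 in the slide).
    """
    # Vector from the user's chest to the real hand, and its length Rr.
    v = tuple(h - c for h, c in zip(real_hand, chest))
    Rr = math.sqrt(sum(c * c for c in v))
    # Linear part (Rr < D): virtual hand coincides with the real hand.
    # Non-linear part (Rr >= D): reach grows quadratically with the overshoot.
    Rv = Rr if Rr < D else Rr + k * (Rr - D) ** 2
    # Place the virtual hand along the same direction, at distance Rv.
    scale = Rv / Rr if Rr > 0 else 0.0
    return tuple(c + scale * vc for c, vc in zip(chest, v))

# Example: a hand 0.7 m from the chest is mapped slightly beyond 0.7 m.
print(go_go_virtual_hand((0.7, 0.0, 0.0), (0.0, 0.0, 0.0)))
```

The paper suggests placing the threshold at roughly two-thirds of the user's arm length, so that ordinary near-body interaction remains one-to-one.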
Selection Classification
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Implementation Issues for Selection Techniques
• How to indicate the selection event
• Object intersections
• Feedback
– Graphical
– Aural
– Tactile
• Virtual hand avatar
• List of selectable objects
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Goals of Manipulation
• Object placement
– Design
– Layout
– Grouping
• Tool usage
• Travel
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Manipulation Metaphors
• Simple virtual hand
– Natural but limited
• Ray casting
– Little effort required
– Exact positioning and orienting very difficult (lever-arm effect)
• Hand position mapping
– Natural, easy placement
– Limited reach, fatiguing, overshoot
• Indirect depth mapping
– Infinite reach, not tiring
– Not natural, separates DOFs
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

HOMER Technique
• HOMER (Hand-centered Object Manipulation Extending Ray-casting)
– Select: ray-casting
– Manipulate: hand
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

HOMER Metaphors
• HOMER (ray-casting + arm-extension)
– Easy selection & manipulation
– Expressive over a range of distances
– Hard to move objects away from you
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Motor Programs and Remapping

Motor Programs
• Throughout our lives, we develop fine motor skills to accomplish many specific tasks
– e.g. writing text, tying shoelaces, throwing a ball, and riding a bicycle
• These are often called motor programs
– Learned through repetitive trials, with gradual improvements in precision and ease as the amount of practice increases
http://vr.cs.uiuc.edu/

Motor Programs .
• Eventually, we produce the motions without even having to pay attention to them
– Most people can drive a car without paying attention to the particular operations of the steering wheel, brakes, and accelerator
• The same applies to user interfaces
– Some devices are easier to learn than others
http://vr.cs.uiuc.edu/

Design Considerations
• Effectiveness for the task
– In terms of achieving the required speed, accuracy, and motion range (if applicable)
• Difficulty of learning the new motor programs
– Ideally, the user should not be expected to spend many months mastering a new mechanism
http://vr.cs.uiuc.edu/

Design Considerations .
• Ease of use in terms of cognitive load
– The interaction mechanism should require little or no focused attention after some practice
• Overall comfort during use over extended periods
– The user should not develop muscle fatigue, unless the task is to get some physical exercise
http://vr.cs.uiuc.edu/

Design Considerations ..
• To design and evaluate new interaction mechanisms, it is helpful to start by understanding the physiology and psychology of acquiring motor skills and programs
• We must also consider the corresponding parts for generating output in the form of body motions in the physical world
– In this case, the brain sends motor signals to the muscles, causing them to move, while at the same time incorporating sensory feedback by utilizing the perceptual processes
http://vr.cs.uiuc.edu/

Design Considerations ...
(a) Part of the cerebral cortex is devoted to motion. (b) Many other parts interact with the cortex to produce and execute motions, including the thalamus, spinal cord, basal ganglia, brain stem, and cerebellum.
(Figures provided by The Brain from Top to Bottom, McGill University)
http://vr.cs.uiuc.edu/

Neurophysiology of Movement
• Consider the neural hardware involved in the learning, control, and execution of voluntary movements
• Some parts of the cerebral cortex are devoted to motion
– The primary motor cortex is the main source of neural signals that control movement
– The premotor cortex and supplementary motor area appear to be involved in the preparation and planning of movement
http://vr.cs.uiuc.edu/
Atari 2600 paddle controller

Neurophysiology of Movement .
• Many more parts are involved in motion and communicate through neural signals
• The most interesting part is the cerebellum, meaning 'little brain'
– Located at the back of the skull
• It seems to be a special processing unit that is mostly devoted to motion, but it is also involved in functions such as attention and language
http://vr.cs.uiuc.edu/
The Atari Breakout game, in which the bottom line segment is a virtual paddle that allows the ball to bounce to the top and eliminate bricks upon contact

Neurophysiology of Movement ..
• Damage to the cerebellum has been widely seen to affect fine motor control and the learning of new motor programs
– It has been estimated to contain around 101 billion neurons, far more than the entire cerebral cortex, which contains around 20 billion
• Even though the cerebellum is much smaller, this large number is achieved through smaller, densely packed cells
• In addition to coordinating fine movements, it appears to be the storage center for motor programs
http://vr.cs.uiuc.edu/

Neurophysiology of Movement ...
• One of the most relevant uses of the cerebellum for VR is in learning sensorimotor relationships, which become encoded into a motor program
• All body motions involve some kind of sensory feedback
– The most common example is hand-eye coordination
• Developing a tight connection between motor control signals and sensory and perceptual signals is crucial to many tasks
– This is also widely known in engineered systems, in which sensor feedback and motor control are combined in applications such as robotics and aircraft stabilization; the subject that deals with this is called control systems
http://vr.cs.uiuc.edu/

Neurophysiology of Movement ....
• One of the most important factors is how long it takes to learn a motor program
– There is great variation across humans
• A key concept is neuroplasticity
– The potential of the brain to reorganize its neural structures and form new pathways to adapt to new stimuli
http://vr.cs.uiuc.edu/

Neurophysiology of Movement .....
• Toddlers (children 12 to 36 months old) have a high level of neuroplasticity
– This becomes greatly reduced over time through the process of synaptic pruning
• As a result, healthy adults have about half as many synapses per neuron as a child of age two or three
– Unfortunately, this means that adults have a harder time acquiring new skills, such as learning a new language or learning how to use a complicated interface
http://vr.cs.uiuc.edu/

Learning Motor Programs
• A typical example is the computer mouse
– The sensorimotor mapping seems a bit complicated
– Young children seem to learn how to use the mouse almost immediately, whereas older adults require some practice

Motor Programs for VR
• A perceptual experience is controlled by body movement that is sensed through a hardware device
– Using the universal simulation principle, any of these and more could be brought into a VR system
– The physical interaction part might be identical, or it could be simulated through another controller
– Using tracking methods, the position and orientation of body parts can be reliably estimated and brought into VR
– For the case of head tracking, it is essential to maintain the viewpoint with high accuracy and zero effective latency
• Otherwise, the VR experience is significantly degraded
• The motion of the sense organ must be matched by a tracking system
http://vr.cs.uiuc.edu/

Remapping
• For the motions of other body parts, this perfect matching is not critical
• Our neural systems can instead learn associations that are preferable in terms of comfort
– In the same way that the mouse and keyboard work in the real world
• Thus, we want to do remapping, which involves learning a sensorimotor mapping that produces different results in the virtual world than one would expect in the real world
– The keyboard example above is one of the most common examples of remapping
• The process of pushing a pencil across paper to produce a letter has been replaced by pressing a key
• The term remapping is even used with keyboards to mean the assignment of one or more keys to another key
http://vr.cs.uiuc.edu/

Remapping .
• Remapping is natural for VR
– For example, rather than reaching out to grab a virtual door knob, one could press a button to open the door
– For a simpler case, consider holding a controller whose pose is tracked through space, as allowed by the HTC Vive system
• A scaling parameter could be set so that one centimeter of hand displacement in the real world corresponds to two centimeters of displacement in the virtual world
• This is similar to the scaling parameter for the mouse
http://vr.cs.uiuc.edu/
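A minimal sketch of this kind of displacement remapping, assuming a simple frame-by-frame update (the function and parameter names are illustrative, not from the cited source): the controller's displacement since the interaction began is multiplied by a gain before being applied to the virtual hand, in the same spirit as a mouse gain scaling cursor motion.

```python
def remap_displacement(controller_pos, grab_start_pos, virtual_start_pos, gain=2.0):
    """Scale real hand displacement into virtual hand displacement.

    controller_pos, grab_start_pos, virtual_start_pos: (x, y, z) in meters.
    gain: 2.0 means 1 cm of real motion produces 2 cm of virtual motion.
    """
    return tuple(v0 + gain * (c - c0)
                 for v0, c, c0 in zip(virtual_start_pos, controller_pos, grab_start_pos))

# Example: moving the controller 1 cm along x moves the virtual hand 2 cm.
print(remap_displacement((0.01, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
```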
Locomotion

Introduction to Locomotion
• Suppose that the virtual world covers a much larger area than the part of the real world that is tracked
– The matched zone is small relative to the virtual world
• Some form of interaction mechanism is needed to move the user in the virtual world while she remains fixed within the tracked area in the real world
• An interaction mechanism that moves the user in this way is called locomotion
– It is as if the user is riding in a virtual vehicle that is steered through the virtual world
http://vr.cs.uiuc.edu/

Locomotion Spectrum
• Moving from left to right, the amount of viewpoint mismatch between real and virtual motions increases
http://vr.cs.uiuc.edu/

Redirected Walking
• Redirected walking is a technique in which a user is tracked through a very large space, and it is possible to make the user think that she is walking in straight lines for kilometers while in fact walking in circles
http://vr.cs.uiuc.edu/

Redirected Walking .
• Walking along a straight line over long distances without visual cues is virtually impossible for humans (and robots!)
– Because in the real world it is impossible to achieve perfect symmetry
• One direction will tend to dominate through an imbalance in motor strength and sensory signals, causing people to travel in circles
http://vr.cs.uiuc.edu/

Redirected Walking ..
• Imagine a VR experience in which a virtual city contains long, straight streets
– As the user walks down the street, the yaw direction of the viewpoint can be gradually varied
– This represents a small amount of mismatch between the real and virtual worlds, and it causes the user to walk along circular arcs
http://vr.cs.uiuc.edu/

Redirected Walking ...
• The main trouble with this technique is that the user has free will and might decide to walk to the edge of the matched zone in the real world
– Even if he cannot directly perceive it
• In this case, an unfortunate, disruptive warning might appear, suggesting that he must rotate to reset the yaw orientation
http://vr.cs.uiuc.edu/

Locomotion Implementation
• Consider the case of sitting down and wearing a headset
– The middle cases of the locomotion spectrum
• Locomotion can then be simply achieved by moving the viewpoint with a controller
– Think of the matched zone as a controllable cart that moves across the ground of the virtual environment
http://vr.cs.uiuc.edu/
Locomotion along a horizontal terrain can be modeled as steering a cart through the virtual world. A top-down view is shown. The yellow region is the matched zone, in which the user's viewpoint is tracked. The values of xt, zt, and θ are changed by using a controller.

Locomotion Implementation .
• First consider the simple case in which the ground is a horizontal plane
• Let Ttrack denote the homogeneous transform that represents the tracked position and orientation of the cyclopean (center) eye in the physical world
http://vr.cs.uiuc.edu/

Locomotion Implementation ..
• The position and orientation of the cart are determined by a controller
• The homogeneous matrix Tcart encodes the position (xt, zt) and orientation θ of the cart, as a yaw rotation by θ about the vertical axis combined with a translation by (xt, 0, zt)
• The height is set at yt = 0 so that it does not change the height determined by tracking or other systems
http://vr.cs.uiuc.edu/

Locomotion Implementation ...
• The eye transform is obtained by chaining Ttrack and Tcart to obtain Teye = (Tcart Ttrack)^-1, following the convention that the eye transform maps world coordinates into the viewer's frame
• To move the viewpoint for a fixed direction θ, the xt and zt components are obtained by integrating the differential equations dxt/dt = s cos θ and dzt/dt = s sin θ
• The variable s is the forward speed
http://vr.cs.uiuc.edu/

Locomotion Implementation ....
• Integrating over a time step Δt, the position update appears as xt ← xt + s Δt cos θ and zt ← zt + s Δt sin θ
• The average human walking speed is about 1.4 meters per second
– The virtual cart can be moved forward by pressing a button or key that sets s = 1.4
– Another button can be used to assign s = -1.4, which results in backward motion
– If no key or button is held down, then s = 0, which causes the cart to remain stopped
http://vr.cs.uiuc.edu/
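A minimal sketch of this cart model in Python with NumPy (the function names and the choice of measuring θ from the x axis in the xz-plane are illustrative assumptions, not definitions from the cited book): each frame, the cart state (xt, zt, θ) is integrated from the forward speed s, converted to a homogeneous matrix, and chained with the tracked head pose to form the viewpoint.

```python
import numpy as np

def cart_matrix(x_t, z_t, theta):
    """Homogeneous transform of the cart: yaw rotation about the y axis
    plus a translation by (x_t, 0, z_t); the height stays at y_t = 0."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s, x_t],
                     [0.0, 1.0, 0.0, 0.0],
                     [ -s, 0.0,   c, z_t],
                     [0.0, 0.0, 0.0, 1.0]])

def update_cart(x_t, z_t, theta, s, dt):
    """Euler-integrate the cart position for one time step dt at speed s.
    s = 1.4 m/s roughly matches average walking speed; s = 0 stops the cart."""
    x_t += s * np.cos(theta) * dt
    z_t += s * np.sin(theta) * dt
    return x_t, z_t

def eye_transform(T_track, x_t, z_t, theta):
    """Chain the cart pose with the tracked head pose and invert, assuming
    the eye transform maps world coordinates into the viewer's frame."""
    return np.linalg.inv(cart_matrix(x_t, z_t, theta) @ T_track)

# Example: walk forward for one 90 Hz frame while the tracked head is at the origin.
x, z, th = *update_cart(0.0, 0.0, 0.0, s=1.4, dt=1 / 90), 0.0
T_eye = eye_transform(np.eye(4), x, z, th)
print(x, z)
```

Pressing and releasing the forward/backward keys simply switches s between 1.4, -1.4, and 0 before the next call to the update function.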
Locomotion Implementation .....
• An alternative control scheme is to use the two buttons to increase or decrease the speed until some maximum limit is reached
– In this case, motion is sustained without holding down a key
• Keys could also be used to provide lateral motion, in addition to forward/backward motion
– This is called strafing in video games and should be avoided (if possible)
http://vr.cs.uiuc.edu/

Changing Direction Issues
• Consider the orientation θ: to move in a different direction, it needs to be reassigned
– The assignment could be made based on the user's head yaw direction
– This becomes convenient and comfortable when the user is sitting in a swivel chair and looking forward
• By rotating the swivel chair, the direction can be set
• However, this could become a problem for a wired headset, because the cable could wrap around the user
http://vr.cs.uiuc.edu/

Changing Direction Issues .
• In a fixed chair, it may become frustrating to control θ because the comfortable head yaw range is limited to only about 60 degrees in each direction
– In this case, buttons can be used to change θ by small increments in the clockwise or counterclockwise direction
• Unfortunately, changing θ at a constant angular velocity causes yaw vection, which is nauseating to many people
– Some users prefer to tap a button to instantly yaw about 10 degrees each time (see the sketch after this block)
– If the increments are too small, then vection appears again; if the increments are too large, then users become confused about their orientation
http://vr.cs.uiuc.edu/

Changing Direction Issues ..
• Another issue is where to locate the center of rotation
– What happens when the user moves his head away from the center of the chair in the real world?
– Should the center of rotation be the original head center or the new head center?
On the right the yaw rotation axis is centered on the head, for a user who is upright in the chair. On the left, the user is leaning over in the chair. Should the rotation axis remain fixed, or move with the user?
http://vr.cs.uiuc.edu/

Changing Direction Issues ...
• If it is chosen as the original center, then the user will perceive a large translation as θ is changed
– However, this would also happen in the real world if the user were leaning over while riding in a cart
http://vr.cs.uiuc.edu/
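A minimal sketch contrasting the two ways of reassigning θ discussed above (the function names and the frame-rate in the example are illustrative assumptions): smooth turning integrates a constant angular velocity and therefore produces sustained yaw vection, while snap turning applies a discrete increment per button tap.

```python
import math

def smooth_turn(theta, angular_velocity, dt):
    """Turn at a constant angular velocity (radians per second).
    Easy to control, but the sustained rotation of the viewpoint causes
    yaw vection, which many users find nauseating."""
    return theta + angular_velocity * dt

def snap_turn(theta, direction, increment_deg=10.0):
    """Instantly yaw by a fixed increment when a button is tapped.
    direction is +1 (counterclockwise) or -1 (clockwise). Too small an
    increment reintroduces vection; too large and users lose orientation."""
    return theta + direction * math.radians(increment_deg)

# Example: one 90 Hz frame of smooth turning versus a single 10-degree snap turn.
print(smooth_turn(0.0, math.radians(45), 1 / 90), snap_turn(0.0, +1))
```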
Vection Reduction Strategies
• The main problem with locomotion is vection, which leads to VR sickness
– Six different kinds of vection occur, one for each DOF
– Numerous factors affect the sensitivity to vection
• Reducing the intensity of these factors should reduce vection
– And hopefully VR sickness
• Several strategies for reducing vection-based VR sickness exist
http://vr.cs.uiuc.edu/

Vection Reduction Strategies .
• 1) If the field of view for the optical flow is reduced, then the vection is weakened
– A common example is to make a cockpit or car interior that blocks most of the optical flow
• 2) If the viewpoint is too close to the ground, then the magnitudes of the velocity and acceleration vectors of moving features are higher
– This is why you might feel as if you are traveling faster in a small car that is low to the ground, in comparison to riding at the same speed in a truck or minivan
http://vr.cs.uiuc.edu/

Vection Reduction Strategies ..
• 3) Surprisingly, a larger mismatch for a short period of time may be preferable to a smaller mismatch over a long period of time
http://vr.cs.uiuc.edu/
(a) Applying constant acceleration over a time interval to bring the stopped avatar up to a speed limit. The upper plot shows the speed over time; the lower plot shows the acceleration. The interval of time over which there is nonzero acceleration corresponds to a mismatch with the vestibular sense. (b) In this case, an acceleration impulse is applied, so the desired speed limit is achieved immediately; the mismatch occurs over a time interval that is effectively zero length. In practice, the perceived speed changes in a single pair of consecutive frames. Surprisingly, case (b) is much more comfortable than (a). It seems the brain prefers an outlier for a very short time interval, as opposed to a smaller, sustained mismatch over a longer time interval (such as 5 seconds).

Vection Reduction Strategies ...
• 4) Having high spatial frequency will yield more features for the human vision system to track
– Therefore, if the passing environment is smoother, with less detail, then vection should be reduced
• 5) Reducing contrast
– Such as making the world seem hazy or foggy while accelerating, may help
• 6) Providing other sensory cues, such as blowing wind or moving audio sources, might provide stronger evidence of motion
– Including vestibular stimulation in the form of a rumble or vibration may also help lower the confidence of the vestibular signal
http://vr.cs.uiuc.edu/

Vection Reduction Strategies ....
• 7) If the world is supposed to be moving, rather than the user, then making this clear through cues or special instructions can help
• 8) Providing specific tasks, such as firing a laser at flying insects, may provide enough distraction from the vestibular conflict
– If the user is instead focused entirely on the motion, then she might become sick more quickly
• 9) The adverse effects of vection may decrease through repeated practice
– People who regularly play FPS games in front of a large screen already seem to have reduced sensitivity to vection in VR
http://vr.cs.uiuc.edu/

Non-planar Locomotion
• Consider more complicated locomotion cases
– If the user is walking over a terrain, then the y component can simply be increased or decreased to reflect the change in altitude
• In the case of moving through a 3D medium, all six forms of vection become enabled
– Common settings include a virtual spacecraft, aircraft, or scuba diver
– Yaw, pitch, and roll vection can be easily generated
http://vr.cs.uiuc.edu/

Non-planar Locomotion .
• Adding special effects that move the viewpoint will cause further difficulty with vection
– For example, making an avatar jump up and down will cause vertical vection
http://vr.cs.uiuc.edu/
https://www.frontiersin.org/articles/10.3389/fpsyg.2016.00039/full

Specialized Hardware
• Many kinds of hardware have been developed to support locomotion
– One of the oldest examples is to create an entire cockpit for aircraft flight simulation
http://vr.cs.uiuc.edu/
(a) An omnidirectional treadmill used in a CAVE system by the US Army for training. (b) A home-brew bicycle riding system connected to a VR headset, developed by Paul Dyan.

Teleportation
• In VR we can move in ways that are physically implausible
– The user is immediately transported to another location
• How is the desired location determined?
• One simple mechanism is a virtual laser pointer (or 3D mouse)
– Accomplished by the user holding a controller that is similar in shape to a laser pointer in the real world
• A smartphone could even be used
– The user rotates the controller to move a laser dot in the virtual world
– This requires performing a ray casting operation to find the nearest visible triangle along the ray that corresponds to the laser light
http://vr.cs.uiuc.edu/

Teleportation .
• To select a location where the user would prefer to stand, simply point the virtual laser and press a key to be instantly teleported
• To make pointing at the floor easier, the beam could actually be a parabolic arc that follows gravity (see the sketch below)
– Places that are not visible can be selected using pop-up maps, text-based searches or voice commands
http://vr.cs.uiuc.edu/
A virtual "laser pointer" that follows a parabolic arc so that a destination for teleportation can be easily specified as a point on the floor. (Image from the Budget Cuts game on the HTC Vive platform.)

Teleportation ..
• One method involves showing the user a virtual small-scale version of the environment
– This is effectively a 3D map
http://vr.cs.uiuc.edu/
https://play.google.com/store/apps/details?id=com.virtualamigos.vrzombie
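A minimal sketch of the parabolic teleport beam in Python (the function name, the launch speed, and the flat floor at y = 0 are illustrative assumptions): the arc is stepped forward under gravity from the controller position, and the point where it meets the floor becomes the teleport destination. A real system would ray-cast each segment against the scene geometry rather than testing a single plane.

```python
def teleport_target(origin, direction, speed=8.0, gravity=9.8, dt=0.02, max_steps=500):
    """Trace a parabolic 'laser' arc from the hand-held controller and return
    the point where it meets the floor plane y = 0 (the teleport destination).

    origin: controller position (x, y, z); direction: unit pointing vector.
    Returns None if the arc does not reach the floor within max_steps.
    """
    x, y, z = origin
    vx, vy, vz = (speed * d for d in direction)
    for _ in range(max_steps):
        # Simple ballistic step; a full implementation would intersect each
        # segment with the scene's triangles to find the nearest hit.
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        vy -= gravity * dt
        if y <= 0.0:
            return (x, 0.0, z)
    return None

# Example: pointing slightly upward from a controller held at 1.2 m height.
print(teleport_target((0.0, 1.2, 0.0), (0.0, 0.3, -0.95)))
```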
Wayfinding
• The cognitive problem of learning a spatial representation and using it to navigate is called wayfinding
• One trouble with locomotion systems that are not familiar from the real world is that users might not learn the spatial arrangement of the world around them
– Would your brain still form place cells for an environment in the real world if you were able to teleport from place to place?
– We widely observe this phenomenon with people who learn to navigate a city using only GPS or taxi services, rather than doing their own wayfinding
http://vr.cs.uiuc.edu/

Wayfinding .
• The teleportation mechanism reduces vection, and therefore VR sickness
– However, it may come at the cost of reduced learning of the spatial arrangement of the environment
• When performing teleportation, it is important not to change the yaw orientation of the viewpoint
– Otherwise, the user may become even more disoriented
http://vr.cs.uiuc.edu/

Manipulation

Introduction
• The virtual world does not have to follow the complicated physics of manipulation
– It is instead preferable to make operations such as selecting, grasping, manipulating, carrying, and placing an object as fast and easy as possible
– Extensive reaching or other forms of muscle strain should be avoided, unless the VR experience is designed to provide exercise
http://vr.cs.uiuc.edu/
Tom Cruise moving windows around on a holographic display in the 2002 movie Minority Report. It is a great-looking interaction mechanism for Hollywood, but it is terribly tiring in reality. The user would quickly experience gorilla arms.

Avoid Gorilla Arms
• One of the most common misconceptions among the public is that the interface used by Tom Cruise in the movie Minority Report is desirable
– See the previous slide
• In fact, it quickly leads to the well-known problem of gorilla arms, in which the user quickly feels fatigue from extended arms
• How long can you hold your arms directly in front of yourself without becoming fatigued?
http://vr.cs.uiuc.edu/

Selection
• One of the simplest ways to select an object in the virtual world is with the virtual laser pointer
– Several variations may help to improve the selection process
• With a pointer, the user simply illuminates the object of interest and presses a button
– If the goal is to retrieve the object, then it can be immediately placed in the user's virtual hand or inventory
– If the goal is to manipulate the object in a standard, repetitive way, then pressing the button could cause a virtual motor program to be executed
http://vr.cs.uiuc.edu/
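A minimal sketch of pointer-based selection in Python (hypothetical names; spheres stand in for real scene geometry, whereas an actual implementation would intersect the ray with the scene's triangles): the ray from the controller is tested against every selectable object, and the nearest hit in front of the user is chosen.

```python
import math

def select_with_pointer(ray_origin, ray_dir, objects):
    """Return the name of the nearest object hit by the pointer ray.

    objects: dict name -> (center, radius); ray_dir is assumed unit length.
    """
    best_name, best_t = None, math.inf
    for name, (center, radius) in objects.items():
        oc = tuple(o - c for o, c in zip(ray_origin, center))
        b = 2.0 * sum(d * o for d, o in zip(ray_dir, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue                      # the ray misses this object
        t = (-b - math.sqrt(disc)) / 2.0  # distance to the first intersection
        if 0.0 < t < best_t:              # keep the closest hit in front of the user
            best_name, best_t = name, t
    return best_name

# Example: the pointer aims down the -z axis and hits the nearer "lamp".
scene = {"lamp": ((0.0, 0.0, -2.0), 0.3), "vase": ((0.0, 0.0, -5.0), 0.3)}
print(select_with_pointer((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene))
```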
Selection .
• If the object is hard to see, then the selection process may be complicated
– It might be behind the user's head, which might require uncomfortable turning
– The object could be so small or far away that it occupies only a few pixels on the screen, making it difficult to select precisely
http://vr.cs.uiuc.edu/

Manipulation
• If the user carries an object over a long distance, then it should not be necessary to squeeze or clutch the controller
– Doing so would yield unnecessary fatigue
• In some cases, the user might be expected to carefully inspect the object while having it in possession
– For example, he might want to move it around in his hand to determine its 3D structure
– The object orientation could be set to follow exactly the 3D orientation of a controller that the user holds
– The user could even hold a real object in hand that is tracked by external cameras but has a different appearance in the virtual world
• This provides familiar force feedback to the user
http://vr.cs.uiuc.edu/

Placement
• Consider ungrasping the object and placing it into the world
• An easy case for the user is to press a button and have the object simply fall into the right place
• This is accomplished by a basin of attraction, which is an attractive potential function defined in a neighborhood of the target pose (position and orientation); see the sketch below
http://vr.cs.uiuc.edu/
To make life easier for the user, a basin of attraction can be defined around an object so that when the basin is entered, the dropped object is attracted directly to the target pose.

Placement .
• Alternatively, the user may be required to delicately place the object
• Perhaps the application involves stacking and balancing objects as high as possible
• In this case, the precision requirements would be very high
– Placing a burden on both the controller tracking system and the user
http://vr.cs.uiuc.edu/
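A minimal sketch of a basin of attraction for placement in Python (hypothetical names; only position is handled here, while a full implementation would also blend orientation toward the target pose): if the released object lies within a target's basin radius, it is pulled a fraction of the way toward that target each frame until it settles.

```python
import math

def apply_basin_of_attraction(drop_pos, targets, radius=0.15, strength=0.5):
    """Pull a dropped object toward the nearest target pose if it falls
    inside that target's basin (a sphere of the given radius).

    drop_pos: where the user released the object; targets: list of (x, y, z).
    strength: fraction of the remaining offset removed per call (call once
    per frame until the object reaches the target). Returns the new position.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    nearest = min(targets, key=lambda t: dist(drop_pos, t))
    if dist(drop_pos, nearest) > radius:
        return drop_pos                      # outside every basin: leave it alone
    # Inside the basin: move a fraction of the way toward the target pose.
    return tuple(p + strength * (t - p) for p, t in zip(drop_pos, nearest))

# Example: an object released 5 cm from a shelf slot is drawn toward it.
print(apply_basin_of_attraction((0.05, 1.0, 0.0), [(0.0, 1.0, 0.0)]))
```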
Remapping
• The simplest case is the use of a button to select, grasp, and place objects
– Instead of a button, continuous motions could be generated by the user and tracked by the system
– Examples include turning a knob, moving a slider bar, moving a finger over a touch screen, and moving a free-floating body through space
• Recall that one of the most important aspects of remapping is easy learnability
– Reducing the number of degrees of freedom that are remapped will generally ease the learning process
• To avoid gorilla arms and related problems, a scaling factor could be imposed on the tracked device so that a small amount of position change in the controller corresponds to a large motion in the virtual world
http://vr.cs.uiuc.edu/

Current Systems
• The development of interaction mechanisms for manipulation remains one of the greatest challenges for VR
– Current-generation consumer VR headsets either leverage existing game controllers, as in the bundling of the XBox 360 controller with the Oculus Rift in 2016, or introduce systems that assume large hand motions are the norm, as in the HTC Vive headset controller
http://vr.cs.uiuc.edu/
(a) A pair of hand-held controllers that came with the HTC Vive headset in 2016; the device includes side buttons, a trigger, and a touch pad for the thumb. (b) A user trying the controllers (prototype version).

Current Systems .
• Others are developing gesturing systems that involve no hardware in the hands, as in the Leap Motion system
• Rapid evolution of methods and technologies for manipulation can be expected in the coming years, with increasing emphasis on user comfort and ease of use
http://vr.cs.uiuc.edu/
(a) The hand model used by Leap Motion tracking. (b) The tracked model superimposed on an image of the actual hands.

Social Interaction

Introduction
• Communication and social interaction are vast subjects
• Social interaction in VR (or social VR) remains in a stage of infancy, with substantial experimentation and rethinking of paradigms occurring
• Connecting humans together is one of the greatest potentials of VR technology
• Although it might seem isolating to put displays between ourselves and the world around us, we can also be brought closer together through successful interaction mechanisms
• This section highlights several interesting issues with regard to social interaction, rather than providing a complete review
http://vr.cs.uiuc.edu/

Shannon-Weaver Model
• An important factor is how many people will be interacting through the medium
– Start with a pair of people
• One of the most powerful mathematical models ever developed is the Shannon-Weaver model of communication
– Used in designing communication systems in engineering
http://vr.cs.uiuc.edu/
The classical Shannon-Weaver model of communication (from 1948). The sender provides a message to the encoder, which transmits the message through a channel corrupted by noise. At the other end, a decoder converts the message into a suitable format for the receiver.

Beyond the Shannon-Weaver Model
• This model is powerful in that it mathematically quantifies human interaction
– But it is also inadequate for covering the kinds of interactions that are possible in VR
• By once again following the universal simulation principle, any kind of human interaction that exists in the real world could be brought into VR
– Simple gestures and mannerisms can provide subtle but important components of interaction that are not captured by the classical communication model
http://vr.cs.uiuc.edu/

From Avatars to Visual Capture
• How should others see you in VR?
– This is one of the most intriguing questions, because it depends on both the social context and the technological limitations
• Many possibilities exist!
• A user may represent himself through an avatar
– The avatar might not correspond at all to his visible, audible, and behavioral characteristics
http://vr.cs.uiuc.edu/
A collection of starter avatars offered by Second Life

From Avatars to Visual Capture .
• At the other extreme, a user might be captured using imaging technology and reproduced in the virtual world with a highly accurate 3D representation
• In this case, it may seem as if the person were teleported directly from the real world to the virtual world
– Many other possibilities exist along this spectrum, and it is worth considering the tradeoffs
http://vr.cs.uiuc.edu/
Holographic communication research from Microsoft in 2016. A 3D representation of a person is extracted in real time and superimposed in the world, as seen through augmented reality glasses (HoloLens).

From Avatars to Visual Capture ..
• One major appeal of an avatar is anonymity
– It offers the chance to play a different role or exhibit different personality traits in a social setting
• In a phenomenon called the Proteus effect, it has been observed that a person's behavior changes based on the virtual characteristics of the avatar
– Similar to the way in which people have been known to behave differently when wearing a uniform or costume
• The user might want to live a fantasy, or try to see the world from a different perspective
– e.g. people might develop a sense of empathy if they are able to experience the world from an avatar that appears to be different in terms of race, gender, height, weight, age, and so on
http://vr.cs.uiuc.edu/

From Avatars to Visual Capture ...
• Users may also want to experiment with other forms of embodiment
– For example, a group of children might want to inhabit the bodies of animals while talking and moving about
– Imagine if you could have people perceive you as an alien, an insect, an automobile, or even as a talking block of cheese
• People were delightfully surprised in 1986 when Pixar brought a desk lamp to life in the animated short Luxo Jr.; Hollywood movies over the past decades have been filled with animated characters
– And we have the opportunity to embody some of them while inhabiting a virtual world!
http://vr.cs.uiuc.edu/

Moving Toward Physical Realism
• Based on current technology, three major kinds of similarity can be independently considered:
– Visual appearance
• How close does the avatar seem to the actual person in terms of visible characteristics?
– Auditory appearance
• How much does the sound coming from the avatar match the voice, language, and speech patterns of the person?
– Behavioral appearance
• How closely do the avatar's motions match the body language, gait, facial expressions, and other motions of the person?
http://vr.cs.uiuc.edu/

Visual Appearance
• The first kind of similarity could start to match the person by making a kinematic model in the virtual world that corresponds in size and mobility to the actual person
• Other simple matching, such as hair color, skin tone, and eye color, could be performed
• To further improve realism, texture mapping could be used to map skin and clothes onto the avatar
– e.g. a picture of the user's face could be texture mapped onto the avatar's face
• Highly accurate matching might also be achieved by constructing synthetic models, or by combining information from both imaging and synthetic sources
http://vr.cs.uiuc.edu/

Visual Appearance Example
http://vr.cs.uiuc.edu/
The Digital Emily project from 2009: (a) A real person is imaged.
(b) Geometric models are animated, along with sophisticated rendering techniques, to produce realistic facial movement.

Auditory Appearance
• For the auditory part, users of Second Life and similar systems have preferred text messaging
– This interaction is treated as if they were talking aloud, in the sense that text messages can only be seen by avatars that would have been close enough to hear them at the same distance in the real world
– Texting helps to ensure anonymity
• Recording and reproducing voice is simple in VR, making it much simpler to match auditory appearance than visual appearance
• One must take care to render the audio with proper localization, so that it appears to others to be coming from the mouth of the avatar
http://vr.cs.uiuc.edu/

Behavioral Appearance
• The behavioral experience could be matched perfectly
– While the avatar has a completely different visual appearance
• This is the main motivation for motion capture systems, in which the movements of a real actor are recorded and then used to animate an avatar in a motion picture
– Note that movie production is usually a long, off-line process
– Accurate, real-time performance that perfectly matches the visual and behavioral appearance of a person is currently unattainable in low-cost VR systems
• Furthermore, capturing the user's face is difficult if part of it is covered by a headset, although some recent progress has been made in this area
http://vr.cs.uiuc.edu/

Behavioral Appearance .
• Current tracking systems can be leveraged to provide accurately matched behavioral appearance:
– e.g. head tracking can be directly linked to the avatar's head, so that others can see where the head is turned
– Users can also understand head nods or gestures, such as "yes" or "no"
http://vr.cs.uiuc.edu/
Oculus Social Alpha, which was an application for the Samsung Gear VR. Multiple users could meet in a virtual world and socialize; in this case, they are watching a movie together in a theater. Their head movements are conveyed using head tracking data, and they are also able to talk to each other with localized audio.
From one-on-one to Societies
• Consider social interaction on different scales
• One important aspect of one-on-one communication is whether the relationship between the two people is symmetrical or complementary
– In a symmetrical relationship the two people are of equal status
– In a complementary relationship one person is in a superior position, as in the case of a boss and employee or a parent and a child
• This greatly affects the style of interaction, particularly in a targeted activity
http://vr.cs.uiuc.edu/

Additional Interaction Mechanisms

Media Interaction
• The content of the Internet can be brought into VR in numerous ways by following the universal simulation principle
– A web browser could appear on a public display in the virtual world, or on any other device that is familiar to users in the real world
– Alternatively, a virtual screen may float directly in front of the user, while a stable, familiar background is provided
The Valve Steam game app store when viewed in the HTC Vive headset
http://vr.cs.uiuc.edu/

Text Entry and Editing
• One option is to track a real keyboard and mouse, making them visible in VR
• Tracking of fingertips may also be needed to provide visual feedback
• This enables a system to be developed that magically transforms the desk and surrounding environment into anything
• Much like the use of a background image on a desktop system, a relaxing panoramic image or video could envelop the user while she works
• For the actual work part, rather than having one screen in front of the user, a number of screens or windows could appear all around and at different depths
http://vr.cs.uiuc.edu/

3D Design and Visualization
• VR offers the ability to interact with and view 3D versions of a design or data set
– This could be from the outside looking in, perhaps at the design of a new kitchen utensil
– It could also be from the inside looking out, perhaps at the design of a new kitchen
• Viewing a design in VR can be considered a kind of virtual prototyping, before a physical prototype is constructed
– This enables rapid, low-cost advances in product development cycles
http://vr.cs.uiuc.edu/

Review of Interaction for VR

Taxonomy of Virtual Object Manipulation Techniques
• Poupyrev et al. (1999) developed a taxonomy of virtual interaction techniques and performed an empirical evaluation of the techniques
• The techniques are categorized into two classes:
– Egocentric
– Exocentric
http://ieomsociety.org/ieom2014/pdfs/360.pdf
Taxonomy of virtual object manipulation techniques (Poupyrev et al., 1998)

Technique Classification by Components
http://courses.cs.vt.edu/cs5754/lectures/interaction_part1.pdf

Taxonomy of Selection/Manipulation Techniques
http://ieomsociety.org/ieom2014/pdfs/360.pdf

Conclusions
• Many more forms of interaction can be imagined, even by just applying the universal simulation principle
– Video games have already provided many ideas for interaction via a standard game controller
• Beyond that, the Nintendo Wii remote has been especially effective in making virtual versions of sports activities such as bowling a ball or swinging a tennis racket
• What new interaction mechanisms will be comfortable and effective for VR?
• If displays are presented to senses other than vision, then even more possibilities emerge
– e.g. could you give someone a meaningful hug on the other side of the world if they are wearing a suit that applies the appropriate forces to the body?
http://vr.cs.uiuc.edu/ Questions