Innovative solutions for behavioral research

FaceReader™
Tool for automatic analysis of facial expressions

Quick Start Guide
Version 8

Information in this document is subject to change without notice and does not represent a commitment on the part of Noldus Information Technology bv. The software described in this document is furnished under a license agreement. The software may be used or copied only in accordance with the terms of the agreement.

Copyright © 2018 Noldus Information Technology bv. All rights reserved. No part of this publication may be reproduced, transmitted, transcribed, stored in a retrieval system, or translated into any other language in whole or in part, in any form or by any means, without the written permission of Noldus Information Technology bv.

FaceReader is a trademark of VicarVision bv. The Observer XT is a registered trademark of Noldus Information Technology bv. Other product names are trademarks or registered trademarks of their respective companies.

Documentation: Olga Krips. December 2018

Noldus Information Technology bv.
International headquarters
Wageningen, The Netherlands
Telephone: +31-317-473300
Fax: +31-317-424496
E-mail: info@noldus.nl

For addresses of our other offices and support, please see our website www.noldus.com.


Up and running quickly

This Quick Start Guide guides you through the main steps to analyze facial expressions with FaceReader. Only the most basic features are addressed. Inevitably, some features that may be vital to your application are not discussed. You can find additional information in the FaceReader Help.

general information

The first part of the Quick Start Guide contains general information on using FaceReader. Follow the instructions to set up your system, install FaceReader, and analyze facial expressions. You can extend your FaceReader license with a number of modules.

project analysis module

The Project Analysis Module allows you to create groups of participants based on independent variable values like age and gender, and to analyze average expression values per group. In addition, you can view the stimulus video together with the video of the test participant's face and the FaceReader analysis. The second part of this Quick Start Guide describes the Project Analysis Module.

other modules

FaceReader also has the following other modules:

Action Unit Analysis Module
To analyze a set of 20 action units of the Facial Action Coding System (FACS). These are the action units that are most commonly used.

Remote PPG Module
To estimate the heart rate and heart rate variability of the subject in front of the camera by means of remote photoplethysmography (RPPG). This method is based on the fact that changes in blood volume due to pressure pulses cause small changes in the reflectance of the skin (see the sketch at the end of this chapter).

Consumption Behavior Module
To analyze consumption behaviors like Taking a bite and Chewing. Please note that the Consumption Behavior Module is experimental.

more information?

See the FaceReader Help that opens when you press F1 in the program. It can also be accessed from the Windows apps screen, and can be downloaded from www.noldus.com/downloads.

Support

If you encounter problems, see www.noldus.com/support-center for a help desk in your area. Note that if you send us videos showing people's faces, you must have permission from those people to use the videos for that purpose, and you may need to sign a form granting consent for us to use those videos.
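tip
As a rough illustration of the principle behind remote photoplethysmography (this is not FaceReader's actual implementation), the Python sketch below estimates a pulse rate from the average skin reflectance of a face region over time: the periodic change in reflectance shows up as a peak in the frequency spectrum of that signal. The frame rate, the simulated signal, and the frequency band are assumptions for the example.

```python
import numpy as np

# Minimal sketch of the RPPG principle (not FaceReader's algorithm):
# the mean skin reflectance of a face region varies slightly with the
# pressure pulse; its dominant frequency corresponds to the heart rate.

fps = 30.0                      # assumed camera frame rate
t = np.arange(0, 30, 1 / fps)   # 30 seconds of samples

# Simulated mean skin reflectance: a 1.2 Hz (72 bpm) pulse plus noise.
# In practice this signal would come from averaging a facial skin region
# in each video frame.
signal = 0.01 * np.sin(2 * np.pi * 1.2 * t) + 0.005 * np.random.randn(t.size)

# Remove the baseline and look at the frequency spectrum.
signal = signal - signal.mean()
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fps)

# Restrict to plausible heart rates (roughly 40 to 180 beats per minute).
band = (freqs >= 0.7) & (freqs <= 3.0)
peak_freq = freqs[band][np.argmax(spectrum[band])]
print(f"Estimated heart rate: {peak_freq * 60:.0f} beats per minute")
```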
Physical setup

The physical setup of your experiment is crucial for an accurate analysis with FaceReader. We give the following general guidelines:

- Place the camera in front of the test participant and slightly below eye level. Make sure your camera provides images with good contrast and brightness.
- Good lighting is crucial. Avoid direct light, reflections, and shadows on the face. Make sure the lighting on the face comes directly from the front, for example by placing the setup in front of a window. If necessary, use lights on either side of the monitor, or a professional photo light, to increase the light intensity or to compensate for unwanted light sources.


Install FaceReader

To install FaceReader

1. Insert the FaceReader installation USB stick into your computer.
2. Double-click the file FaceReader 8 Setup.exe.
3. Select the language for the interface. You have the choice between Chinese (simplified) and English.
4. As Installation type, select Standard.
5. If you bought one of the webcams that are supported by FaceReader, select it in the Drivers and Tools field.
6. Follow the rest of the instructions on your screen to install FaceReader.

To install the Stimulus Presentation Tool (Project Analysis Module only)

If you have the Project Analysis Module, you need the Stimulus Presentation Tool to present stimuli automatically. To install it:

1. Double-click the file Stimulus Presentation Tool 3 Setup.exe.
2. Select the language for the interface. You have the choice between Chinese (simplified) and English.
3. As Installation type, select Standard.
4. Follow the rest of the instructions on your screen to install the Stimulus Presentation Tool.


Work with FaceReader

setup

1. Depending on your license, do one of the following:
   - If you have a hardware key, insert it into the computer and open FaceReader.
   - If you have a site license, open FaceReader, choose Start with site license, and log in.
2. Create a new project (File > New > Project). Give the project a name and select a location to store it, or accept the default location.
3. Choose Participant > Add Participant. Enter a name or identification code for the participant.
4. The gender and age of your participants are added as independent variables and are automatically estimated by FaceReader. To set the age and gender manually, double-click the participant name and click on the pencil button next to the independent variable to enter the values.
5. Add one or more analyses for each participant and choose whether to analyze images, video, or a live camera feed. To do so, select the participant for which you want to add an analysis in the Project Explorer and click on the Video, Camera, or Image button on the toolbar to add the appropriate analysis. It is not possible to mix image analyses with camera or video analyses. For a camera analysis, select the option Use as default camera if you always use this camera. To record audio, select your microphone. Optionally, select Record to create a video file of the test participant's face. The video is always saved at a frame rate of 15 frames per second.
To add multiple video analyses to a participant, right-click the participant's name, select Add Multiple Video Analyses, and select your videos.

analyze

1. Click on the magnifying glass button next to an analysis to open it.
2. Check the options for this analysis in the Settings window in the bottom-left corner of the analysis window. To create default settings for each new analysis, choose File > Settings, open the tab Default Analysis Settings, and make your selection. See the FaceReader Help for an explanation of the options. Press F1 in the program to open it.

   important
   If you select the Baby Face model, facial expressions are not available as analysis output. You will obtain Action Unit intensities instead. See Action Unit Module in the FaceReader Help for more information.

3. Click on the Start analysis button to carry out the analysis.

   tip
   To carry out all analyses at once, click on the Start batch analysis button on the toolbar.

   important
   We recommend closing all visualization windows before you start a batch analysis. Bear in mind that carrying out a batch analysis on a large number of long videos with a high resolution and frame rate may cause problems.

4. The Image quality bar should cross both dashed lines. If this is not the case, improve the lighting or reposition your camera.

output

FaceReader displays a number of windows with graphical and tabular output.

Important notes
- Not all visualization options may be available by default. If you do not see some of the options described in this Quick Start Guide, choose File > Settings > Analysis Options and select all options.
- If you selected the Baby Face model, not all output is available. See the FaceReader Help for more information.

Procedure
1. Click on one of the buttons in the Analysis Visualization window to show, for example, the key points in the face or the Facial States.
2. To switch windows, click on the Select window button in the upper-right corner of one of the windows and make your selection. See Analysis windows below for a short description of the options.
3. To zoom, or to copy or save graphs, click on one of the icons on the window toolbar.
4. To show more windows, click on the Split/Unsplit button in the upper-right corner of one of the windows.

tip
If you notice that the test participant shows a bias towards certain facial expressions, use one of the calibration methods to correct for this. See Analyze Facial Expressions/Calibrate FaceReader in the FaceReader Help for details.

Analysis windows

For the upper windows you can choose between:

- Analysis Visualization – Click on a button on the left to view how FaceReader analyzes the face.
- Subject Characteristics – With, for example, the estimated age and gender.
- Facial States – Whether, for example, the mouth or eyes are open or closed.
- Monitor Identity – If you previously added participant faces to the database, FaceReader shows the test participant's identity.
- Expression Intensity – A chart that displays which facial expressions show up in the face.
- Expression Summary – A pie chart with the distribution of the facial expressions.
- Circumplex Model of Affect – A chart in which emotions are described in a two-dimensional circular space, with arousal on the vertical axis and valence on the horizontal axis (see the sketch after this list).
- Action Unit Intensity – Available with the Action Unit Module.
- Heart rate – Available with the Remote PPG Module.
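tip
As a minimal illustration of the valence-arousal space used by the Circumplex Model of Affect (not a reproduction of FaceReader's own chart), the Python sketch below plots a series of valence and arousal values in that two-dimensional space. The values here are simulated; in practice they could come from valence and arousal saved to a detailed log (see the export section further on). The axis ranges are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulated valence (x) and arousal (y) samples; in a real case these
# would come from a FaceReader detailed log export.
rng = np.random.default_rng(0)
valence = np.clip(np.cumsum(rng.normal(0, 0.05, 200)), -1, 1)
arousal = np.clip(0.5 + np.cumsum(rng.normal(0, 0.02, 200)), 0, 1)

fig, ax = plt.subplots(figsize=(5, 5))
# Color the points by sample index so the progression over time is visible.
ax.scatter(valence, arousal, s=10, c=np.arange(valence.size), cmap="viridis")
ax.axvline(0, color="gray", linewidth=0.5)   # neutral valence
ax.set_xlabel("Valence (negative to positive)")
ax.set_ylabel("Arousal (inactive to active)")
ax.set_title("Valence-arousal space (illustrative)")
ax.set_xlim(-1, 1)
plt.show()
```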
For the lower windows you can choose between the line charts:

- Timeline – An overview of the facial expressions and facial states on a timeline.
- Valence Line Chart – The valence indicates whether the emotional status of the test participant is positive or negative.
- Arousal Line Chart – Arousal indicates whether the test participant is active or inactive.
- Expression Line Chart – A line chart with the facial expression intensities over time.
- Head Orientation Line Chart – The head orientation in the X, Y, and Z directions on a timeline.
- Heart Rate and Variability Line Chart – Available with the Remote PPG Module.

See FaceReader's Output in the FaceReader Help for a full description of the analysis windows.

export

To export your data, choose File > Export. Choose whether to export the results of a single analysis, a participant, or the entire project. You have the following options:

- State log – A text file with the dominant facial expressions as states over time.
- Detailed log – A text file with the intensities of all facial expressions over time. To save additional data, such as facial states, global gaze direction, and valence and arousal values, to the log file, select these options in the Data Export tab of the Settings window (File > Settings). Optionally, adjust the sample rate of the export file and select whether to include headers.
- The Observer XT log – Choose this option if you want to import the analysis results into the annotation software The Observer® XT for further analysis.

tip
It is also possible to send the analysis results to The Observer XT in real time during the FaceReader analysis, using the Noldus network communication program N-Linx. See FaceReader with The Observer XT in the FaceReader Help for details.


Project Analysis Module

setup

The general procedure to set up an experiment with FaceReader also applies to the Project Analysis Module. The Project Analysis Module contains the following extra options:

1. Independent Variables – The independent variables Age and Gender are present by default. Choose Project > Independent Variable > Add Independent Variable to add more independent variables, like whether the participants saw the commercial before, or their native language. Double-click Independent Variables under a participant name to score them. Age and Gender can be estimated by FaceReader or entered manually.
2. Participant Groups – Create participant groups (Participant Group > Add Participant Group) based on the values of the independent variables. For example, create groups based on age, gender, or the participants' previous experience with the commercial.
3. Stimuli and Event Markers – Define stimuli or event markers (Project menu) to mark episodes of interest. Stimuli have a fixed duration and can be linked to a video or image to show to the test participants.

Note
Analyses with the Baby Face model are not included in project analysis.

stimulus presentation tool

Use the Stimulus Presentation Tool to show the stimuli to the test participants automatically and to synchronize the stimuli with the analyses.

important
In a two-computer setup, synchronize the computer clock times with a time server. See Synchronize computers with a network time protocol in the FaceReader Help for how to do so. A quick way to check a computer's clock offset is sketched below.
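tip
The FaceReader Help describes how to configure time synchronization itself. Purely as an optional extra check, the Python sketch below queries a public NTP server and reports how far the local clock is off. It assumes the third-party ntplib package (pip install ntplib) and network access to pool.ntp.org; running it on both computers gives a rough idea of their clock difference.

```python
import ntplib  # third-party package: pip install ntplib

# Query a public NTP server and report the local clock offset.
# Run this on both computers; if the offsets differ noticeably,
# (re)synchronize the clocks before running a test.
client = ntplib.NTPClient()
response = client.request("pool.ntp.org", version=3)
print(f"Local clock offset: {response.offset:+.3f} seconds")
```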
On the computer with FaceReader

1. Choose File > Settings > Data Export and, under External Communication (API and Stimulus Presentation Tool), select the checkbox Enable External Control.
2. To add tests, open the Tests tab in the bottom pane of the Project Explorer. Then click on the Add test button.
3. Select the camera and the stimuli to show to the test participants. Optionally, choose to let participants enter their own name, age, and gender, and to randomize the presented stimuli.

On the test participant computer

1. Start the Stimulus Presentation Tool and follow the instructions to connect with the FaceReader computer.
2. Select a test and click Start.
3. Fill in the participant details and click Start again, or let the participant do this. The test and analysis start.

analyze

When you use the Stimulus Presentation Tool, participants are added and analyses are carried out automatically. Watch the Image quality bar while the test runs. The bar should cross both dashed lines. If this is not the case, improve the lighting or reposition your camera. Optionally, score event markers during the analysis.

output

Apart from the general FaceReader output options (see the output section in the previous chapter), the Project Analysis Module has the following output:

Numerical Group Analysis

1. Choose Participant Group > Open Numerical Group Analysis to calculate absolute expression intensities for participant groups. Alternatively, click on the Open Numerical Group Analysis button on the toolbar.
2. Choose stimuli or event markers from the lists on the toolbar to get the results per stimulus or event marker.
3. Optionally, choose Relative from the Result Type list on the toolbar to calculate expression intensities relative to the averages in another part of the analysis.
4. Set the episode from step 3 in the Project Analysis Settings window and then select the stimulus or event marker from the list on the toolbar (see step 2).
5. Click a participant group or stimulus name to perform t-tests on the expression intensities. A colored cell indicates a significant difference: an open triangle indicates P<0.05, a closed triangle indicates P<0.01. (A sketch of a similar comparison on exported data follows at the end of this guide.)
6. Choose Export from the toolbar to export the numerical group analysis results for participants or groups.
7. The bottom window shows bar graphs or box plots of expression intensities, valence, or arousal for the selected participant groups.

Temporal Group Analysis

1. Choose Participant Group > Open Temporal Group Analysis to view the stimulus video together with a participant video and the analysis for the participant group.
2. Select a stimulus from the list on the toolbar.
3. Select a participant and analysis in the Analysis Visualization window.
4. Optionally, choose Relative from the Result Type list on the toolbar to calculate expression intensities relative to the average intensities in another episode. Set the episode in the Project Analysis Settings window (see step 4 of the Numerical Group Analysis procedure) and select the stimulus from the list on the toolbar (step 2).
5. The bottom window shows absolute or relative expression intensities, valence, or arousal for the participant group over time, or a pie chart of absolute average expressions.
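tip
FaceReader performs the group comparisons described above within the program. Purely as an illustration of the same kind of comparison carried out on exported data, the Python sketch below computes each participant's mean "Happy" intensity from hypothetical detailed-log exports and runs an independent t-test between two participant groups. The file names, group names, column name, and separator are assumptions; check them against your own export settings.

```python
import pandas as pd
from scipy import stats

# Hypothetical detailed-log exports, one file per participant analysis.
# The group names and file names are assumptions for this example.
group_files = {
    "Seen commercial before": ["P01_detailed.txt", "P02_detailed.txt"],
    "Not seen before": ["P03_detailed.txt", "P04_detailed.txt"],
}

def mean_happy(path):
    # The logs are assumed to be tab-separated text files with one row per
    # sample; adjust the separator and column name to your export settings.
    log = pd.read_csv(path, sep="\t")
    return pd.to_numeric(log["Happy"], errors="coerce").mean()

group_means = {
    group: [mean_happy(f) for f in files]
    for group, files in group_files.items()
}

# Independent-samples t-test on the per-participant means, analogous to
# the comparison reported in the Numerical Group Analysis.
t_stat, p_value = stats.ttest_ind(*group_means.values())
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```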