Human Computer Interaction (HCI)
prof. David Smahel, Ph.D.
"Humans are the weakest link in cyber security."

The user (knowledge, personal characteristics, …) and the computer interface (informatics) meet in HCI.

Which different disciplines enter HCI, and in what way? Social sciences and humanities: Psychology, Sociology, Media Studies, Education, Social Work, Law, Linguistics, …

What is HCI
§Carroll, J. M. (2009). Human computer interaction (HCI). Interaction Design Encyclopedia. Retrieved June 6, 2010.

What is HCI
§HCI experts study how people interact with computers and then develop new technologies that help people use computers more effectively. The goal of HCI is to minimize the cost of interaction and make interactions more "human".
§The field of HCI aims to improve human-computer interaction by improving the functionality, usability, reliability, and comfort of computer interfaces.
§(Association of HCI: https://www.hci.org.uk/human-computer-interaction)

Areas of HCI
§User experience (+ UX research methods)
§Interaction design
§Usable security
§Gaming
§Augmented reality & virtual reality
§Human-AI interaction (explainable AI, user-friendly AI)
§Interaction with robots
§Health HCI
§Assistive technology
§Speech recognition technology (Alexa, …)
§Eye tracking

Usability
§Usability as "the quality of the interaction in terms of parameters such as the time taken to perform tasks, the number of errors made, and the time it takes to become a competent user" (Benyon et al., 2005, p. 52).
§A quality attribute that assesses how easy user interfaces are to use.
§(Issa, T., & Isaias, P. (2022). Usability and human-computer interaction (HCI). In Sustainable Design: HCI, Usability and Environmental Concerns (pp. 23-40). London: Springer London.)
Usability criteria
§Learnability: how easily new users can begin effective interaction and achieve maximal performance.
§Flexibility: the number of ways in which the user and the system can exchange information.
§Robustness: the level of support given to the user in determining and assessing the successful achievement of goals.
§Efficiency: once the user has "learned the system", the speed with which they can carry out tasks.
§Memorability: how easily the user remembers the system's functions after a period of not using it.
§Errors: how many errors users make and how severe they are.
§Satisfaction: how pleasant the system is to work with.
§(Issa, T., & Isaias, P. (2022). Usability and human-computer interaction (HCI). In Sustainable Design: HCI, Usability and Environmental Concerns (pp. 23-40). London: Springer London.)

Usable security: Experimental research of ICT user behavior in the domain of security
§David Smahel §Vaclav Matyas §Vlasta Stavova §Lenka Dedkova §Hana Machackova §Kamil Malinka §Radim Polcak
§IRTIS

ESET: Potentially unwanted applications
§How to encourage users to enable PUA (potentially unwanted application) detection?
§Increase users' security by increasing the number of users who enable PUA detection (spyware, adware, etc.) during the antivirus installation process
§Legal issues: the user's own decision, no recommendation
§PC platform

When installing an antivirus program on a PC, one of the installation steps is a user dialog asking whether or not the user also wants to detect potentially unwanted applications. A potentially unwanted application is spyware or adware that lies in a "grey zone": it is not as harmful as a virus, but it may still spy on the user or slow the PC down. The legal aspect of the whole challenge is particularly interesting: PUA producers tend to sue ESET if the detect / do-not-detect options are not presented equally.

Become UX designers! The baseline screen: it is one step in the installation.

ESET: Potentially unwanted applications
§Experiment 1:
§designed 15 versions (incl. the original one)
§tested on beta users, N = 87,028
§Experiment 2:
§repeated on standard users
§2.1 N = 499,142
§2.2 N = 600,000+ (still collecting; prior to data cleaning)
§differences in the behavior of beta vs. standard users
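With samples of this size, the difference in PUA-enable rates between two installer variants (for example, the original screen vs. a redesign) can be checked with a simple two-proportion z-test; the percentages reported on the result slides below are exactly such enable rates. The sketch below is a minimal illustration only, not the analysis the team actually ran, and the counts in it are hypothetical, chosen merely to mimic the scale of the beta-tester experiment.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(enabled_a, n_a, enabled_b, n_b):
    """Compare the PUA-enable rates of two installer variants (hypothetical counts)."""
    p_a, p_b = enabled_a / n_a, enabled_b / n_b
    p_pool = (enabled_a + enabled_b) / (n_a + n_b)          # pooled rate under H0: no difference
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical per-variant counts, roughly at the scale of N = 87,028 split across 15 variants:
p_a, p_b, z, p = two_proportion_ztest(enabled_a=4_870, n_a=5_800,   # original screen, ~84% enable
                                      enabled_b=5_210, n_b=5_800)   # redesigned screen, ~90% enable
print(f"original {p_a:.1%} vs. new {p_b:.1%}, z = {z:.2f}, p = {p:.2g}")
```

In a real analysis with 15 competing variants one would also correct for multiple comparisons; the sketch only shows the basic pairwise logic.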
Proposed variants
We used several good practices, such as adding a pictorial in contrasting colors, adding structure, setting key text in bold type, providing an example, and preparing variants with and without explanatory text.

The changes we made may seem subtle, but a conceptual redesign was out of the question due to several limitations imposed by the company. However, we feel that even with our "subtle" changes we were able to incorporate some traditional warning design features.

Beta testers: Detecting PUA across screens (average vs. the original version)

PUA detection: The good choice
Original screen 83.9% vs. new screen 89.8%. The order of options: the positive option first; "enable" works better than "detect".

PUA detection: Does not matter
Warnings, links.

PUA detection: Does not matter
Text structure, presence of "Live Grid".

PUA detection: The bad choice (but…)
Original screen 74.5%; new screens 72.8%, 71.4%; "full" screen 72.4%.

Beta testers vs. standard users
§Study: https://www.youtube.com/watch?v=H7ik9SeAexQ
§Negligible differences in terms of HW and SW (platform (32/64-bit), CPU model, RAM size, OS version)
§Differences in location

Beta testers vs. standard users
§Small differences in gender and age
§Beta testers slightly more skilled
§But overall, both sub-samples are quite similar (in ESET's case)

A large-scale comparative study of beta testers and standard users: Case study of a software security system (Stavova, V., Dedkova, L., Ukrop, M., & Matyas, V.), published in Communications of the ACM.

Methods used in HCI
Methods: online or offline
•Qualitative (interviews, focus groups, …)
•User testing
•Quantitative (online surveys)
•Experimental methods (experimental design)
•Content analyses
•Observation
•Diaries
•Ecological momentary assessment (EMA)
•Technical methods
•Mixed methods

Population: who are our users?
•Who are our users and why do they want to use the software?
•How do we reach them for research?
•Can we study them online?
•For which types of research is the online environment suitable?

Ecological momentary assessment (EMA): example of research
§Michaela Lebedikova, Martina Smahelova, David Smahel, Steriani Elavsky, Martin Tancos, Michal Tkaczyk, Jana Blahosova, Jaromir Plhak, Ondrej Sotolar, Michal Schejbal
§Interdisciplinary Research Team on Internet and Society
§Masaryk University, Brno, Czech Republic

Hi, my name is Michaela Lebedikova, and today I will present how we are testing innovative methods in order to capture more precise data on children's and adolescents' technology usage.
Background
Adolescents and Smartphones
•More screen time, more internet usage, greater variability in apps, individualized use (Bedrosova et al., 2018)
•Possible diverse impact
Current Research
•Self-report → not objective data! (Barbovschi, Green & Vandoninck, 2013)
•Call for new methods (Holloway et al., 2013)
•Ecological Momentary Assessment
•Ecological validity, shortened recall bias
•Temporal relationships
•Different sources of variability
•BUT: researcher-provided phones (e.g., Jensen et al., 2019)

Currently, the majority of adolescents use their mobile phones as their first-choice device to access the internet. That leads to more screen time and more variability in use, with a possibly diverse impact on their well-being and development. Until a few years ago, researchers relied primarily on self-report via surveys, focus groups, or interviews to investigate adolescent behavior online. However, with new possibilities such as ecological momentary assessment, which relies on intensive sampling a few times a day for a brief period of time, it became possible to enhance ecological validity and shorten recall bias. Further, it allows us to see the detailed unfolding of temporal relationships and to distinguish between different sources of variability. However, this method is mostly used in health and mood research, typically on researcher-provided phones, which means that current research cannot yet sample objective behavior. To overcome the self-report bias in reporting technology use, we propose the following solution: a combination of intensive sampling and objective smartphone data collection from participants' own devices, with repeated measurement periods, which allows us to see participants' objective smartphone behavior and its development over time.

Background
Project
•Modelling the FUTURE: Understanding the Impact of Technology on Adolescents' Well-Being
•How to best capture adolescents' technology usage?
•Example: social media use
Solution
•Combine:
•Intensive sampling (EMA)
•Objective smartphone data collection
•Participant's own device!
•Measurement-burst design (Sliwinski, 2008)
Our Study
IRTIS App: questionnaires 4 times per day, objective data, screenshots, conversations
Smartphone App
•3 data sources
•Installed on the participant's device via Google Play
•Available on Android 5 (Lollipop) and higher
Ethical Standards
•Approval of all relevant ethical bodies & the project lawyer
•Standard data protection practices
•Machine-learning-based anonymization software for screenshots and all third-party data (Sotolář, Plhák, Šmahel, 2021)

Our project, Modelling the FUTURE, is concerned with how technology impacts the psychological, social, and physical well-being of adolescents. We developed a research app for Android devices, available on Google Play, which combines three data sources: self-report questionnaires (in which we ask about sleep habits, affect, or momentary stressors), objective smartphone data (such as the name of the currently used application and the time spent in it), and screenshots. Additionally, some participants also provided their Messenger conversation data. All data are sent to encrypted university servers, and apart from adhering to standard ethical procedures, we developed machine-learning-based anonymization software for all the textual data.

Research Design
Each measurement burst: 15 days. Entry questionnaire, then four bursts (1st burst 5-6/2021, 2nd burst 9-10/2021, 3rd burst 1-2/2022, 4th burst 4-5/2022), then an output questionnaire. Within each burst: intensive sampling from the 3 data sources for 14 days and a post-burst questionnaire on the 15th day.

Our study takes one year. At the beginning, participants fill in an online entry questionnaire and download our application. We opted for the measurement-burst design, in which data collection periods are repeated over time. In our study, we have four bursts, each taking 15 days. For 14 days, data from three sources are collected: EMA questionnaires four times a day on a semi-random basis, objective smartphone data, and screenshots. On the fifteenth day, there is only one questionnaire summarizing the past fourteen days. At the end of the whole study, participants fill in an online output questionnaire.

Illustrative Example
With this type of research design we could, for example, ask…
•What is the impact of social media use on momentary affect?
•Is the impact of social media on affect different for girls and boys? (between-person hypothesis)
•Is the impact of social media on affect different on days when use is higher than average? (within-person hypothesis)

To explain the analytical advantages and the kinds of answers you can get with our research design, let me show you an example research question. Let's imagine we want to figure out what the impact of social media use on momentary affect is. Our app logs the duration of every app use, so for each participant we can aggregate their daily social media use.

Illustrative example: Analytical advantages
June 2021, N = 201, compliance rate 73%
Example: Between-Person Variability in Positive Activated Affect (subset of 13 people with data from the evening EMA questionnaire)

I will illustrate this with example data from our first measurement burst, which took place this year in June, with 201 participants and a questionnaire compliance rate of 73%. These data are only a subset of 15 participants; please do not treat them as results, they are shown just to illustrate my points. As I mentioned, one of the many advantages is the ability to monitor the detailed temporal unfolding of variability.
Here I have an example of variability in positive activated affect, which is one of the affective components we measure in the EMA questionnaires; it asks whether participants are feeling happy or excited. If you remember, we wanted to see whether social media use impacts affect. Each line represents one participant's fluctuation over the 14 days, and the blue line shows the mean of all participants, but you can see that no one looks like the mean. Let's assume that affect varies with social media use: the more use, the higher the affect. The trajectories are nevertheless different, and that is not only because people are different (for example, they have different traits) but also because their behavior changes (for example, someone who is used to using social media a lot may have a parent restrict their access). In cross-sectional data you cannot disentangle these two sources of variability, and as Brose and colleagues point out, such data are then not illustrative of any kind of variability. But with this type of design you actually can distinguish between the sources, and with very precise objective data.

Illustrative example: Analytical advantages
June 2021, N = 201, compliance rate 73%
Example: Within-Person Variability in Positive Activated Affect (subset of 13 people with data from the evening EMA questionnaire); two participants with M = 50.43 (SD = 11.17) and M = 50.21 (SD = 25.79)

To further illustrate my point, let's have a look at these two participants. They have nearly the same average; however, that average is not very representative of their individual trajectories. Only by examining the trajectories can we see how they felt on specific days and whether those fluctuations had anything to do with their media use. For example, participant 3282 might have had the highest affect on days with high social media use, when they talked to their friends, followed by low affect on days when they did not have enough time for social media. As demonstrated, our data allow for detailed insights that come in handy for different types of inquiries.
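The between-person vs. within-person distinction discussed above can be made concrete by person-mean centering the intensive longitudinal data. The sketch below uses pandas with entirely made-up records (the participant IDs, column names, and values are illustrative, not the study's actual data or pipeline); it splits each day's social media use into a stable person mean and a daily deviation from that mean, the two components that a multilevel model would use to test the between-person and within-person hypotheses separately.

```python
import pandas as pd

# Illustrative long-format EMA data: one row per participant-day (all values made up).
ema = pd.DataFrame({
    "participant": [3282, 3282, 3282, 4105, 4105, 4105],
    "day":         [1, 2, 3, 1, 2, 3],
    "sns_minutes": [120, 30, 95, 20, 25, 180],    # daily social media use
    "affect":      [62, 41, 55, 48, 50, 70],      # evening positive activated affect
})

# Between-person component: each participant's average use across the burst.
ema["sns_person_mean"] = ema.groupby("participant")["sns_minutes"].transform("mean")

# Within-person component: how much a given day deviates from that person's own average.
ema["sns_within"] = ema["sns_minutes"] - ema["sns_person_mean"]

print(ema)
# In a multilevel model, sns_person_mean addresses the between-person question
# (do heavier users feel differently on average?), while sns_within addresses the
# within-person question (does a person feel differently on above-average days?).
```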
Feasible? Yes!

Intensive testing required
•5 pilot studies (N = 85), numerous in-house pilot tests
However…
Recruitment difficulties
•Combination of a recruitment agency and social media (low willingness to participate)
Study protocol adjustments
•Fitting the questionnaire windows to participants' school attendance and hobbies
Technical issues
•Different devices, Google Play policy, questionnaire notifications
Privacy concerns
•Parents, participants, schools (research advertisement problem)

With that said, we found that this type of study is feasible with adolescents; however, we faced some serious challenges. First of all, our decision to collect data from participants' own devices was the most challenging, as each device is different and we ran into many technical difficulties. The biggest issue was with correct questionnaire notifications, as some phones, usually of Asian origin such as Xiaomi or Huawei, did not notify correctly. We also had problems with the Google Play policy; for example, at one point our app was suspended without notice until we made some changes to the data we collect. All of that required intensive testing on multiple devices within our own team, with college students, and finally with potential participants: we ran 5 pilot studies, each testing a new version of the app. For the main study, we faced low willingness to participate. Our recruitment agency was not able to find enough participants, so we had to recruit some on our own through social media. Privacy concerns were repeatedly among the reasons for non-participation. Finally, we had to adjust the time windows for the questionnaires to fit participants' schedules.
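As an illustration of the prompt-scheduling problem mentioned above (four questionnaires per day on a semi-random basis, with time windows adjusted to school attendance), the sketch below draws one random prompt time inside each of four predefined daily windows while keeping a minimum gap between prompts. The window boundaries and the gap are hypothetical values, not the actual settings of the IRTIS app.

```python
import random
from datetime import datetime, timedelta

# Hypothetical daily windows (start hour, end hour) fitted around school and free time.
WINDOWS = [(7, 8), (12, 14), (16, 18), (19, 21)]

def schedule_prompts(day: datetime, windows=WINDOWS, min_gap_minutes=60):
    """Draw one semi-random prompt time per window, keeping a minimum gap between prompts."""
    prompts = []
    earliest = day  # no constraint before the first window
    for start_h, end_h in windows:
        lo = max(day.replace(hour=start_h, minute=0), earliest)
        hi = day.replace(hour=end_h, minute=0)
        if lo >= hi:                      # window fully consumed by the minimum-gap constraint
            continue
        offset = random.randint(0, int((hi - lo).total_seconds() // 60))
        prompt = lo + timedelta(minutes=offset)
        prompts.append(prompt)
        earliest = prompt + timedelta(minutes=min_gap_minutes)
    return prompts

for t in schedule_prompts(datetime(2021, 6, 1)):
    print(t.strftime("%H:%M"))
```

A scheduler like this keeps prompt times unpredictable for participants while still respecting their daily routine; the real app additionally had to cope with device-specific notification behavior, which is what caused the issues described above.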