Emotional Keyboard: To Provide Adaptive Functionalities Based on the Current User Emotion and the Context

Wasura D. Wattearachchi, K.P. Hewagamage, Enosha Hettiarachchi

 

pp. 147-174

(https://doi.org/10.55612/s-5002-054-007)

 

 

Abstract

  

Improving the User Experience (UX) of mobile devices is of vital importance due to the advent of emerging technologies and the prevalence of mobile device use. This research aims to develop a model for a mobile device that suggests adaptive functionalities based on the current user emotion and the context, with the goal of changing the user’s negative emotions (sadness and anger) into positive ones. As a proof of concept, a keyboard named “Emotional Keyboard” was developed through five prototypes, built iteratively using Evolutionary Prototyping. Action Research was adopted as the methodology, along with User-Centered Design (UCD), which further included two user surveys. The first three prototypes were implemented to determine the optimal way of perceiving emotion from facial expressions and text analytics. Subsequent prototypes provided affective functions to the user, such as listening to music, playing a game, or chatting with friends, based on the detected negative emotion and the context. Each prototype was evaluated iteratively with user participation. The final (fifth) prototype was evaluated in two phases: an individual analysis (measuring the performance of each user separately) and an overall analysis (averaging the results of the individual analyses to measure the performance of the overall model). Results of both analyses showed that the Emotional Keyboard eventually predicted the adaptive functions correctly for the user, and that its learning process did not terminate, since users’ feedback was continuously used to improve its performance. In conclusion, an “Adaptive System with User Control” was developed, improving the acceptability and usability of a mobile device in line with the research aim.
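The adapt-and-learn loop the abstract describes — detect a negative emotion, suggest an affective function, and keep learning from the user’s accept/reject feedback — could be sketched roughly as follows. This is a minimal illustration only; the class, method, and function names are hypothetical and not taken from the paper, and the real Emotional Keyboard combines facial-expression and text-based emotion detection rather than receiving an emotion label directly.

```python
# Hypothetical sketch of the detect -> suggest -> feedback loop.
# Names are illustrative, not the paper's actual implementation.

NEGATIVE_EMOTIONS = {"sadness", "anger"}


class AdaptiveSuggester:
    """Suggests an affective function for a detected negative emotion
    and keeps learning from explicit user feedback (accept/reject)."""

    def __init__(self, functions):
        # One learned preference score per (emotion, function) pair.
        self.functions = list(functions)
        self.scores = {(e, f): 0 for e in NEGATIVE_EMOTIONS for f in self.functions}

    def suggest(self, emotion):
        if emotion not in NEGATIVE_EMOTIONS:
            return None  # only intervene on negative emotions
        # Pick the function with the highest learned score for this emotion.
        return max(self.functions, key=lambda f: self.scores[(emotion, f)])

    def feedback(self, emotion, function, accepted):
        # Learning never terminates: every user response updates the scores.
        self.scores[(emotion, function)] += 1 if accepted else -1


suggester = AdaptiveSuggester(["listen to music", "play a game", "chat with friends"])
first = suggester.suggest("sadness")       # initial suggestion
suggester.feedback("sadness", first, False)  # user rejects it
print(suggester.suggest("sadness"))        # a different function now ranks higher
```

The key property mirrored here is the “Adaptive System with User Control”: the system proposes, but the user’s explicit feedback continuously reshapes future suggestions instead of the model ever freezing.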

 

Keywords: Human-Computer Interaction, Adaptive System, Emotion Detection, Facial Expressions, Text Analytics, Prototyping, User-Centered Design, Affective Computing, Mobile Computing

 

 


 

 
