Affective gaming
Affective gaming is gaming in which the player's current emotional state is used to manipulate gameplay.[1]
Affective gaming can be implemented through biofeedback. It is, however, important to note that simply using, for instance, the flexing of a muscle as a way of controlling the character is merely biofeedback. To make a game affective, one has to utilize the biometric data to understand and interpret the player's emotional states, feelings and reactions.
History
2003
The first publications on affective gaming appeared in 2003.[2][3]
Allanson, Dix and Gilleade (2005)
Allanson, Dix and Gilleade defined the design heuristics assist me, challenge me and emote me, which they believed could help the designers of affective games.[1]
Assist me: Measuring the user's frustration in combination with the game context makes it possible to identify problematic situations and adjust aspects of the game accordingly. Assist me is about assisting the user: for instance, when the user gets frustrated because he finds himself stuck within the game, the game could give him a clue about how to solve the problem.
Challenge me: Measuring the user's engagement through arousal, to dynamically change the challenge of the game. Most games provide difficulty levels, for instance easy, medium and hard, which do not always match the user's actual skill and experience. By changing the game dynamically with the challenge me concept, the game can challenge players on a more individually tailored level.
Emote me: Many games seek to create reactions and emote the user, but the user may get used to the attempts to create a reaction, and the timing might not always be optimal. Measuring the user's actual emotional state lets the game trigger reactions and emote the user at the best possible time, for instance launching a jump scare when the user is most receptive to being scared.
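The challenge me heuristic can be sketched as a small control loop. Everything below is illustrative: the function names, the smoothing constant and the arousal thresholds are assumptions, not values from the paper; only the idea of mapping an arousal signal to difficulty comes from the heuristics above.

```python
# A minimal sketch of the "challenge me" heuristic: arousal readings
# (e.g. from a skin-conductance sensor, normalized to 0..1) drive a
# difficulty value. All names and thresholds here are illustrative.

def smooth(readings, alpha=0.3):
    """Exponential moving average to damp sensor noise."""
    level = readings[0]
    for r in readings[1:]:
        level = alpha * r + (1 - alpha) * level
    return level

def adjust_difficulty(current_difficulty, arousal,
                      low=0.3, high=0.7, step=0.1):
    """Raise difficulty when the player seems under-stimulated,
    lower it when arousal suggests they are overwhelmed."""
    if arousal < low:          # bored: make the game harder
        return current_difficulty + step
    if arousal > high:         # stressed: ease off
        return max(0.0, current_difficulty - step)
    return current_difficulty  # engaged: leave it alone

readings = [0.2, 0.25, 0.2, 0.15]   # a calm, possibly bored player
print(adjust_difficulty(1.0, smooth(readings)))  # difficulty nudged up
```

The smoothing step matters in practice: raw physiological signals are noisy, and reacting to every spike would make the difficulty oscillate.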
Champion and Dekker (2007)
Champion and Dekker built an example using Half-Life 2, using biofeedback to change aspects of the game.[4]
They used a biosensor to capture the ECG HRV (electrocardiogram heart rate variability) and GSR (galvanic skin response) of a player in real time. Although those two measurements do not give a complete biometric analysis, they do provide insight into the player's reaction to specific game events.
The aspects they changed within the game, based on the biometric data combined with predefined baseline levels, were:
- Based on heart-rate
- The character's speed
- Screen-shake
- Shader effects
- Red filter (excited/stressed)
- White and black filter (calm)
- Based on skin response
- The volume of the environment
- The gravity and density of the environment
- Semi-transparent environment
- Based on both
- Weapon damage
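The mapping above can be sketched as a single function from live readings to game effects. The baselines, thresholds and scaling factors below are hypothetical; only the idea of comparing live readings against predefined baseline levels, and the effect categories in the list, come from Champion and Dekker's work.

```python
# Sketch of baseline-relative biofeedback mapping in the style of
# Champion and Dekker's Half-Life 2 experiment. Thresholds and scaling
# are assumptions for illustration.

BASELINE_HR = 70.0    # assumed resting heart rate, beats per minute
BASELINE_GSR = 5.0    # assumed resting skin conductance, microsiemens

def screen_effects(heart_rate, gsr):
    effects = {}
    hr_delta = heart_rate / BASELINE_HR
    if hr_delta > 1.2:                 # excited/stressed player
        effects["filter"] = "red"
        effects["screen_shake"] = min(1.0, hr_delta - 1.0)
    elif hr_delta < 0.95:              # calm player
        effects["filter"] = "black_and_white"
    gsr_delta = gsr / BASELINE_GSR
    effects["ambient_volume"] = min(1.0, 0.5 * gsr_delta)
    # weapon damage depends on both signals combined
    effects["weapon_damage"] = round(10 * hr_delta * gsr_delta, 1)
    return effects

print(screen_effects(heart_rate=90, gsr=8))
```

Note that both heart rate and skin conductance vary widely between individuals, which is why the paper's approach of calibrating against per-player baselines is essential.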
Kalyn, Lough, Mandryk and Nacke (2011)
Kalyn, Lough, Mandryk and Nacke utilized both indirect and direct physiological control / biofeedback to enhance game interaction.[5]
To get all the inputs, they attached various sensors to the player's body and used the readings as a means to control the game. The aspects they used biofeedback to change within the game:
- Enemy Target Size
- Respiratory rates (C1)
- Skin conductance (C2)
- Length of the flame (Flamethrower)
- Skin conductance (C1)
- Respiratory rates (C2)
- Speed and jump height
- Heart-beat (C1)
- Muscle contraction (C2)
- Weather condition and boss speed (final boss)
- Body temperature (C1)
- Heart-beat (C2)
- Medusa's gaze (Eye tracker, froze enemies and moving platforms)
- Gaze (C1)
- Gaze (C2)
C1 denotes the first condition, and C2 the second.
They found that direct physiological input can be used to replace traditional controller input, but it should reflect an action in the real world. Indirect physiological control is appropriately used indirectly, to change the environment and similar aspects.
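That distinction can be illustrated with two small functions, one per control style. The sensor ranges and scaling constants are assumptions; the pairing of signals to mechanics follows the list above.

```python
# Illustrative split between direct and indirect physiological control,
# following Kalyn et al.'s finding: a direct signal maps onto an action
# that mirrors the real world, while an indirect signal modulates the
# environment. Scaling values here are assumptions.

def flamethrower_length(breath_pressure, max_length=5.0):
    """Direct control: blowing harder lengthens the flame, mirroring
    the real-world act of blowing on a fire (breath_pressure 0..1)."""
    return max_length * min(1.0, max(0.0, breath_pressure))

def boss_speed(body_temp_c, base_speed=1.0):
    """Indirect control: body temperature subtly scales the final
    boss's speed; the player does not steer this consciously."""
    return base_speed * (1.0 + 0.05 * (body_temp_c - 36.5))

print(flamethrower_length(0.6))   # → 3.0
print(round(boss_speed(38.0), 3)) # slightly faster boss
```

The direct mapping is something the player can deliberately perform on demand; the indirect one drifts with the player's state, which is why the authors recommend reserving it for environmental effects.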
Torres (2013)
Torres wrote his Master's thesis on "Development of Biofeedback Mechanisms in a Procedural Environment Using Biometric Sensors", building on the papers mentioned earlier in this section. He examined how to utilize both indirect and direct biofeedback appropriately.[6]
As part of this work he created a biofeedback game development framework named the Emotion Engine (E²), not to be confused with the Sony CPU Emotion Engine.
E² was applied in the survival-horror game VANISH.
E² - The Emotion Engine
The Emotion Engine uses different types of indirect biofeedback to change the game:
Visible Indirect Biofeedback (V-IBF) - Adjusting aspects of the game which are noticeable by the player.
Non-Visible Indirect Biofeedback (NV-IBF) - Adjusting aspects of the game which are not noticeable by the player.
Emotional Regulation Biofeedback (ERB) - Studies the player through the game, and uses that information to dynamically trigger or adjust specific game mechanics.
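The three channels can be sketched as a single routing function. The concrete adjustments, names and thresholds below are hypothetical; only the V-IBF / NV-IBF / ERB split comes from Torres's framework.

```python
# A minimal sketch of how E²'s three indirect-biofeedback channels
# might be organized in code, given one normalized stress reading
# (0..1). All concrete values are illustrative assumptions.

def apply_indirect_biofeedback(stress):
    """Route one stress reading to the three E²-style channels."""
    return {
        # V-IBF: changes the player can notice, e.g. dimming the lights
        "visible": {"light_level": 1.0 - 0.5 * stress},
        # NV-IBF: hidden tweaks, e.g. quietly lowering the spawn rate
        "non_visible": {"enemy_spawn_rate": 1.0 - 0.3 * stress},
        # ERB: trigger a scripted scare only when stress is low, so the
        # player is receptive rather than already overwhelmed
        "emotional_regulation": {"trigger_scare": stress < 0.3},
    }

print(apply_indirect_biofeedback(0.8))
```

Grouping the adjustments by channel keeps the design question explicit: should the player be able to perceive this change, and is it a continuous tweak or a discrete triggered event?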
Potential applications
Biofeedback can be utilized for competitive games, e-sports and e-sports casting. With affective biofeedback, e-sports players can see when they need to calm down, or when a team member needs help because they are "stressing out". It can also help players and teams analyze their performance, and it provides deeper analysis opportunities for casters.
Many games today change aspects of the story based on the choices and decisions the player makes, so biofeedback mechanisms could be incorporated to also tailor the game to the player's emotional states and reactions to events within the game. In combination with VR, this could create more immersive and personalized gameplay than previously possible.
The technology can also contribute to various aspects of e-health systems and to big data projects.
Problems
A sudden boom in biofeedback devices incorporating different kinds of sensors and different methods for reading the data could be problematic for game developers. If there were 50 different devices, each with different sensors producing different data, developers would have to support 50 different standards.
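One common way to contain this kind of fragmentation is an adapter layer: the game codes against a single abstract sensor interface, and each vendor device gets a thin adapter. Everything below is a hypothetical sketch, not an existing standard or SDK.

```python
# Hypothetical sensor-abstraction sketch: the game depends only on
# HeartRateSensor, while per-vendor adapters translate each device's
# native data format into one common reading.

from abc import ABC, abstractmethod

class HeartRateSensor(ABC):
    """The single interface the game engine depends on."""
    @abstractmethod
    def bpm(self) -> float: ...

class VendorASensor(HeartRateSensor):
    """Adapter for a device that reports RR intervals in milliseconds."""
    def __init__(self, rr_interval_ms: float):
        self.rr_interval_ms = rr_interval_ms
    def bpm(self) -> float:
        return 60000.0 / self.rr_interval_ms

class VendorBSensor(HeartRateSensor):
    """Adapter for a device that already reports beats per minute."""
    def __init__(self, raw_bpm: float):
        self.raw_bpm = raw_bpm
    def bpm(self) -> float:
        return self.raw_bpm

def game_reacts(sensor: HeartRateSensor) -> str:
    """Game logic written once, against the abstract interface."""
    return "calm" if sensor.bpm() < 90 else "stressed"

print(game_reacts(VendorASensor(rr_interval_ms=800)))  # 75 bpm → calm
print(game_reacts(VendorBSensor(raw_bpm=120)))         # → stressed
```

Supporting a new device then means writing one small adapter rather than touching the game logic, which is the usual argument for standardizing at the interface rather than at the device level.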
See also
- Affective computing
- Affective video games
- Wearables and biofeedback devices share some similarities and applications
- Valve on Biofeedback in Gameplay
References
- Allanson, J.; Dix, A.; Gilleade, K. (2005). "Affective Videogames and Modes of Affective Gaming: Assist Me, Challenge Me, Emote Me" (PDF). DiGRA.
- Sykes, Jonathan; Brown, Simon (2003). "Affective gaming: measuring emotion through the gamepad" (PDF). CHI 2003: New Horizons: 732–733. doi:10.1145/765891.765957. Retrieved 23 November 2016.
- Gilleade, Kiel; Allanson, J. (2003). "A Toolkit to Explore Affective Interface Adaptation in Videogames". Proceedings of HCI International. 2: 370–374.
- Champion, E.; Dekker, A. (2007). "Please Biofeed the Zombies: Enhancing the Gameplay and Display of a Horror Game Using Biofeedback" (PDF). DiGRA.
- Kalyn, M.; Lough, C.; Mandryk, R.; Nacke, L. (2011). "Biofeedback Game Design: Using Direct and Indirect Physiological Control to Enhance Game Interaction" (PDF). University of Saskatchewan.
- Torres, V. (2013). "Development of Biofeedback Mechanisms in a Procedural Environment Using Biometric Sensors" (PDF). University of Porto.