11–13 Dec. 2024, Lyon (France)


“Proof-of-concept” results on the use of webcam-based facial expression trackers for online studies interested in the influence of affective states on behaviour.
Tristan White  1, 2  , Jacqueline Scholl  3  , Nils Kolling  1  
1 : Stem Cell and Brain Research Institute (SBRI)
2 : PSYR2 Team (Psychiatric Disorders: from Resistance to Response), Centre de Recherche en Neurosciences de Lyon (CRNL)
3 : PSYR2 Team, Centre de Recherche en Neurosciences de Lyon (CRNL)

An individual's behaviour is shaped by their internal state at the time of decision, an essential component of which is their affective state (encompassing so-called "emotions" and "mood"). Complex behaviours, such as foraging or goal pursuit, involve sequential decision-making over time. As time passes, individuals' internal states fluctuate, and these fluctuations shape behaviour. Despite its significance, the impact of this type of intra-individual variability (e.g., emotions fluctuating across a behavioural experiment) on decision-making during complex behaviours remains understudied. This research aims to address this gap by exploring how measures of fluctuating affective states relate to decision-making in an incremental goal-pursuit task. Here, we present facial expression data quality assessments and "proof-of-concept" results, alongside preliminary findings on the relationships between participants' facial expressions, self-reported emotions, and behaviour.

Fifty-two online participants completed an incremental goal-pursuit task (in the form of a fishing game, adapted from Holton et al., 2024) on their own computers. In addition to behavioural data, we continuously collected facial expression data throughout the task using a pre-trained, webcam-based classifier. Self-reported emotions were also recorded at the end of each task block. To evaluate the classifier's accuracy, participants were asked to produce specific facial expressions both before and after the task. We analysed the classifier data at key moments in the task (e.g., at decision points and outcomes) and compared mean classifier outputs across blocks to the self-reported emotions. Finally, regression analyses were performed to explore the relationships between facial expressions, self-reports, and task behaviour.
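
The data-quality measures mentioned here (sampling rate reflecting participants' hardware, and sampling drops reflecting failed face detection) could be derived from timestamped classifier output roughly as sketched below. This is an illustrative assumption, not the study's actual analysis code; the function name, inputs, and the 0.5 s gap threshold are all hypothetical.

```python
# Hypothetical sketch: per-participant quality metrics from timestamped
# webcam-classifier samples. All names and thresholds are illustrative.

def quality_metrics(timestamps_s, face_detected, drop_gap_s=0.5):
    """Return (mean sampling rate in Hz, number of sampling drops).

    timestamps_s  : sorted sample times in seconds
    face_detected : parallel booleans, False when no face was detected
    drop_gap_s    : assumed inter-sample gap counted as a drop
    """
    if len(timestamps_s) < 2:
        return 0.0, 0
    duration = timestamps_s[-1] - timestamps_s[0]
    rate = (len(timestamps_s) - 1) / duration if duration > 0 else 0.0
    # A "drop" here is either a long gap between samples...
    gap_drops = sum(
        1 for a, b in zip(timestamps_s, timestamps_s[1:]) if b - a > drop_gap_s
    )
    # ...or a transition from detected to not-detected.
    detect_drops = sum(
        1 for prev, cur in zip(face_detected, face_detected[1:])
        if prev and not cur
    )
    return rate, gap_drops + detect_drops
```

Metrics like these could then enter the regression analyses as per-participant covariates.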

Regression analyses show that, across participants, facial expression data significantly reflect the pose prompts given during the quality-check phases before and after the task. The classifier's accuracy during these phases was significantly correlated with its sampling rate (reflecting participants' hardware), the number of sampling drops (reflecting failed face detection), and other factors. Analyses in progress include comparisons of facial expressions with self-reported emotion ratings, and of task-event-related changes in facial expressions.
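
A relationship such as the one reported between classifier accuracy and sampling rate could be quantified with a simple Pearson correlation, as in the generic sketch below (with made-up inputs in the usage note; this is not the study's analysis code).

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)
```

For example, `pearson_r(sampling_rates_hz, accuracies)` on hypothetical per-participant vectors would return a value near +1 if higher sampling rates consistently accompanied higher accuracy.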

If validated in a larger sample, these findings suggest webcam-based facial-expression classifiers could provide a reliable means of continuously monitoring facial expressions in online studies. This approach has promising implications for research that benefits from real-time tracking of affective state fluctuations, especially in tasks involving temporally extended behaviours.


References:

Holton, E., Grohn, J., Ward, H., Manohar, S. G., O'Reilly, J. X., & Kolling, N. (2024). Goal commitment is supported by vmPFC through selective attention. Nature Human Behaviour. https://doi.org/10.1038/s41562-024-01844-5

