Assessing and Training Socio-Emotional Skills

Abstract

Based on the group's long-standing experience in research on cognitive aging and its neuronal correlates, as well as with individuals experiencing impairments of social communication, this project develops and evaluates assessment tools, perceptual and cognitive training programs, and tailor-made interventions to improve various aspects of social interaction. Our approach is based on current neurocognitive models of social perception and interaction. Selected subprojects use current technology to synthesize naturalistic facial and vocal stimuli with parameter-specific morphing methods. This technology allows us to create stimuli with augmented ("caricatured") social signals, which have been shown to be effective in improving social perception. Individual aspects of this research programme include (1) an assessment of emotion perception abilities in hearing-impaired individuals with a cochlear implant, (2) the development and evaluation of a training program for improving nonverbal vocal communication in older adults, (3) a systematic assessment of the potential of mu-rhythm neurofeedback training to improve socio-emotional communication and its cortical correlates in adolescents and young adults with autism, (4) the development of improved methods for assessing central auditory processing disorders (CAPD; German: AVWS) – a frequent but incompletely understood cause of learning problems in school children, and (5) the development of new diagnostic tools. All subprojects are characterized by the use of state-of-the-art digital technology to assess and improve social interaction abilities.

People

Selected Relevant Publications

Limbach, K., Itz, M.L., Schweinberger, S.R., Jentsch, A.D., Romanova, L., & Kaufmann, J.M. (2022). Neurocognitive effects of a training program for poor face recognizers using shape and texture caricatures: A pilot investigation. Neuropsychologia, 165, 108133.

Kowallik, A., Pohl, M., & Schweinberger, S.R. (2021). Facial imitation improves emotion recognition in adults with and without sub-clinical autistic traits. Journal of Intelligence, 9(1), 4. (Special Issue: Advances in Socio-Emotional Ability Research; Guest editors: K. Schlegel and S. Olderbak).

Nussbaum, C., & Schweinberger, S.R. (2021). Links between musicality and vocal emotion perception. Emotion Review.

Schweinberger, S.R., von Eiff, C.I., Kirchen, L., Oberhoffner, T., Guntinas-Lichius, O., Dobel, C., Nussbaum, C., Zäske, R., & Skuk, V.G. (2020). The role of stimulus type and social signal for voice perception in cochlear implant users: Response to the letter by Meister H et al. Journal of Speech, Language, and Hearing Research, 63(12), 4327-4328.

Zäske, R., Skuk, V.G., Golle, J., & Schweinberger, S.R. (2020). The Jena Speaker Set (JESS) – A database of voice stimuli from unfamiliar young and old adult speakers. Behavior Research Methods, 52, 990-1007.

Kowallik, A.E., & Schweinberger, S.R. (2019). Sensor-based technology for social information processing in autism: A review. Sensors, 19, 4787.