Melissa Steininger, M. Sc.
Personalized Digital Health and Telemedicine
Affiliation:
Department of Epileptology
University Hospital Bonn
Medical Faculty
University of Bonn
Location:
Venusberg-Campus 1,
Building 74, Room 2G-016
53127 Bonn, Germany
Telephone: +49-228/287-52171
Email: melissa.steininger@ukbonn.de

Short CV:
Melissa Steininger earned her Bachelor’s degree in Cognitive Science (B.Sc.) in 2020 from the University of Osnabrück and her Master’s degree in Visual Computing and Games Technology (M.Sc.) in 2023 from the University of Applied Sciences Bonn-Rhein-Sieg. She is currently pursuing her Ph.D. in Computer Science at the University Hospital Bonn / University of Bonn.
Publications
2025
Jansen, Anna; Morev, Nikita; Steininger, Melissa; Müllers, Johannes; Krüger, Björn
Synthetic Hand Dataset Generation: Multi-View Rendering and Annotation with Blender Conference Forthcoming
Proceedings IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Forthcoming.
@conference{jansen2025c,
title = {Synthetic Hand Dataset Generation: Multi-View Rendering and Annotation with Blender},
author = {Anna Jansen and Nikita Morev and Melissa Steininger and Johannes Müllers and Björn Krüger},
year = {2025},
date = {2025-10-06},
booktitle = {Proceedings IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
abstract = {Pose estimation is a common method for precise hand tracking, which is important for natural interaction in virtual reality (VR). However, training such models requires large-scale datasets with accurate 3D annotations. These are difficult to obtain due to time-consuming data collection and the limited variety of captured scenarios. We present a work-in-progress Blender-based pipeline for generating synthetic multi-view hand datasets. Our system simulates Ultraleap Stereo IR 170-style images and extracts joint positions directly from a rigged hand model, eliminating the need for manual labeling or external tracking processes. The current pipeline version supports randomized static poses with per-frame annotations of joint positions, camera parameters, and rendered images. While extended hand variation, animation features, and different sensor-type simulations are still in progress, our pipeline already provides a flexible foundation for customizable dataset generation and reproducible hand-tracking model training.},
keywords = {},
pubstate = {forthcoming},
tppubtype = {conference}
}
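As an illustration of the annotation step sketched in the abstract, a minimal Blender (bpy) script along the following lines could export world-space joint positions and per-camera renders. The object, bone, and file names are hypothetical placeholders, not the pipeline's actual assets:

# Minimal Blender (bpy) sketch: export world-space joint positions of a
# rigged hand and render one image per camera in the scene.
# "HandArmature" and all file paths are hypothetical placeholders.
import json
import bpy

scene = bpy.context.scene
hand = bpy.data.objects["HandArmature"]  # hypothetical rigged hand model

# World-space head position of each pose bone = one 3D joint annotation.
joints = {
    bone.name: list(hand.matrix_world @ bone.head)
    for bone in hand.pose.bones
}

annotations = {"joints": joints, "cameras": {}}
for cam in (obj for obj in scene.objects if obj.type == "CAMERA"):
    scene.camera = cam
    scene.render.filepath = f"//render_{cam.name}.png"
    bpy.ops.render.render(write_still=True)
    # Store per-camera extrinsics alongside the rendered image.
    annotations["cameras"][cam.name] = [list(row) for row in cam.matrix_world]

with open(bpy.path.abspath("//annotations.json"), "w") as f:
    json.dump(annotations, f, indent=2)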
Steininger, Melissa; Marquardt, Alexander; Perusquía-Hernández, Monica; Lehnort, Marvin; Otsubo, Hiromu; Dollack, Felix; Kruijff, Ernst; Krüger, Björn; Kiyokawa, Kiyoshi; Riecke, Bernhard E.
The Awe-some Spectrum: Self-Reported Awe Varies by Eliciting Scenery and Presence in Virtual Reality, and the User's Nationality Proceedings Article Forthcoming
In: IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Forthcoming.
@inproceedings{steininger2025c,
title = {The Awe-some Spectrum: Self-Reported Awe Varies by Eliciting Scenery and Presence in Virtual Reality, and the User's Nationality},
author = {Melissa Steininger and Alexander Marquardt and Monica Perusquía-Hernández and Marvin Lehnort and Hiromu Otsubo and Felix Dollack and Ernst Kruijff and Björn Krüger and Kiyoshi Kiyokawa and Bernhard E. Riecke},
year = {2025},
date = {2025-10-01},
booktitle = {IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
abstract = {Awe is a multifaceted emotion often associated with the perception of vastness that challenges existing mental frameworks. Despite its growing relevance in affective computing and psychological research, awe remains difficult to elicit and measure.
This raises the research questions of how awe can be effectively elicited, which factors are associated with the experience of awe, and whether it can reliably be measured using biosensors.
For this study, we designed ten immersive Virtual Reality (VR) scenes with dynamic transitions from narrow to vast environments. These scenes were used to explore how awe relates to environmental features (abstract, human-made, nature), personality traits, and country of origin. We collected skin conductance, respiration data, and self-reported awe and presence from participants from Germany, Japan, and Jordan.
Our results indicate that self-reported awe varies significantly across countries and scene types. In particular, a scene depicting outer space elicited the strongest awe. Scenes that elicited high self-reported awe also induced a stronger sense of presence. However, we found no evidence that awe ratings are correlated with physiological responses.
These findings challenge the assumption that awe is reliably reflected in autonomic arousal and underscore the importance of cultural and perceptual context.
Our study offers new insights into how immersive VR can be designed to elicit awe, and suggests that subjective reports—rather than physiological signals—remain the most consistent indicators of emotional impact.},
keywords = {},
pubstate = {forthcoming},
tppubtype = {inproceedings}
}
Alavi, Khashayar; Jansen, Anna; Steininger, Melissa; Mustafa, Sarah Al-Haj; Müllers, Johannes; Surges, Rainer; Helmstaedter, Christoph; von Wrede, Randi; Krüger, Björn
Graph Neural Networks for Analyzing Eye Fixation Patterns in Epilepsy Conference Forthcoming
International Congress on Mobile Health and Digital Technology in Epilepsy, Forthcoming.
@conference{alavi2025a,
title = {Graph Neural Networks for Analyzing Eye Fixation Patterns in Epilepsy},
author = {Khashayar Alavi and Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},
year = {2025},
date = {2025-09-04},
urldate = {2025-09-04},
booktitle = {International Congress on Mobile Health and Digital Technology in Epilepsy},
keywords = {},
pubstate = {forthcoming},
tppubtype = {conference}
}
Jansen, Anna; Steininger, Melissa; Mustafa, Sarah Al-Haj; Müllers, Johannes; Surges, Rainer; Helmstaedter, Christoph; von Wrede, Randi; Krüger, Björn
Search Behavior – Metrics for Analysis of Eye Tracking Data Conference Forthcoming
International Congress on Mobile Health and Digital Technology in Epilepsy, Forthcoming.
@conference{jansen2025b,
title = {Search Behavior – Metrics for Analysis of Eye Tracking Data},
author = {Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},
year = {2025},
date = {2025-09-04},
urldate = {2025-09-04},
booktitle = {International Congress on Mobile Health and Digital Technology in Epilepsy},
keywords = {},
pubstate = {forthcoming},
tppubtype = {conference}
}
Mustafa, Sarah Al-Haj; Jansen, Anna; Steininger, Melissa; Müllers, Johannes; Surges, Rainer; von Wrede, Randi; Krüger, Björn; Helmstaedter, Christoph
Eyes on Cognition: Exploring Oculomotor Correlates of Cognitive Function in Patients with Epilepsy Journal Article
In: Epilepsy & Behavior, vol. 173, art. no. 110562, December 2025.
@article{alhaj2025,
title = {Eyes on Cognition: Exploring Oculomotor Correlates of Cognitive Function in Patients with Epilepsy},
author = {Sarah Al-Haj Mustafa and Anna Jansen and Melissa Steininger and Johannes Müllers and Rainer Surges and Randi von Wrede and Björn Krüger and Christoph Helmstaedter},
doi = {10.1016/j.yebeh.2025.110562},
year = {2025},
date = {2025-06-30},
urldate = {2025-06-30},
journal = {Epilepsy & Behavior},
volume = {173},
number = {110562},
issue = {December 2025},
abstract = {Objective
This study investigates the relationship between eye tracking parameters and cognitive performance during the Trail Making Test (TMT) in individuals with epilepsy and healthy controls. By analyzing ocular behaviors such as saccade velocity, fixation duration, and pupil diameter, we aim to determine how these metrics reflect executive functioning and attentional control.
Methods
A sample of 95 participants with epilepsy and 34 healthy controls completed the TMT while their eye movements were recorded. Partial correlations, controlling for age, sex, education, medication count, seizure status and epilepsy duration, examined associations between eye tracking measures and cognitive performance derived from EpiTrack and TMT performance.
Results
In the patient group, faster TMT-A performance was associated with shorter fixation durations (r = 0.31, p = 0.006). Lower minimum saccade velocity correlated with slower performance on both TMT-A (r = −0.35, p = 0.002) and TMT-B (r = −0.40, p < 0.001), whereas higher peak saccade velocities were linked to worse performance (TMT-A: r = 0.45, p < 0.001; TMT-B: r = 0.41, p < 0.001). Pupil diameter findings indicated that slower TMT performance was associated with smaller minimum pupil sizes (r = −0.23 to r = −0.36), which may indicate increased cognitive effort and attentional load. Higher EpiTrack scores also correlated with a smaller minimum pupil diameter, but only during the more demanding TMT-B, and with a more restricted saccade velocity range, reflecting greater motor control and attentional stability. No significant correlations emerged within the control group.
Conclusion
These findings highlight the potential of eye tracking as a non-invasive tool for assessing cognitive function in epilepsy. Efficient cognitive performance was characterized by stable and controlled eye movements, whereas impaired performance involved erratic saccade dynamics and prolonged fixations. Importantly, eye tracking parameters provide additional information beyond simple speed measurements, potentially enhancing the differential diagnostic capabilities of the TMT in epilepsy. The observed associations between oculomotor parameters and cognitive performance were not present in the control group, suggesting that these relationships may be specific to epilepsy. Future research should investigate whether both basic and advanced metrics of search strategies are sensitive to disease dynamics and treatment effects in epilepsy.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
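As a sketch of the partial-correlation analysis described in the Methods, the pingouin library computes one such association directly; the data file and column names below are hypothetical placeholders for the study's variables:

# Sketch of the abstract's analysis: partial correlation between one eye
# tracking metric and TMT-A completion time, controlling for the listed
# covariates. File and column names are hypothetical; covariates are
# assumed to be numerically coded.
import pandas as pd
import pingouin as pg

df = pd.read_csv("epilepsy_eyetracking.csv")  # hypothetical data table

covariates = ["age", "sex", "education", "medication_count",
              "seizure_status", "epilepsy_duration"]

result = pg.partial_corr(
    data=df,
    x="mean_fixation_duration",  # eye tracking measure
    y="tmt_a_time",              # TMT-A completion time
    covar=covariates,
    method="pearson",
)
print(result[["n", "r", "p-val"]])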
Haaga, Lisa; Jansen, Anna; Steininger, Melissa; Müllers, Johannes; Bausch, Marcel; Jordan, Arthur; Surges, Rainer; Krüger, Björn
EpiEye – Einfluss anfallssupressiver Medikamente auf Augenbewegungen und autonome Veränderungen bei Epilepsien Conference
Dreiländertagung Epilepsie 2025, 2025.
@conference{haaga2025a,
title = {EpiEye – Einfluss anfallssupressiver Medikamente auf Augenbewegungen und autonome Veränderungen bei Epilepsien},
author = {Lisa Haaga and Anna Jansen and Melissa Steininger and Johannes Müllers and Marcel Bausch and Arthur Jordan and Rainer Surges and Björn Krüger},
year = {2025},
date = {2025-03-26},
booktitle = {Dreiländertagung Epilepsie 2025},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
Steininger, Melissa; Jansen, Anna; Welle, Kristian; Krüger, Björn
Optimized Sensor Position Detection: Improving Visual Sensor Setups for Hand Tracking in VR Conference
2025 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 2025.
@conference{steininger2025b,
title = {Optimized Sensor Position Detection: Improving Visual Sensor Setups for Hand Tracking in VR},
author = {Melissa Steininger and Anna Jansen and Kristian Welle and Björn Krüger},
year = {2025},
date = {2025-03-12},
urldate = {2025-03-12},
booktitle = {2025 IEEE Conference Virtual Reality and 3D User Interfaces (VR)},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
Steininger, Melissa; Jansen, Anna; Mustafa, Sarah Al-Haj; Surges, Rainer; Helmstaedter, Christoph; von Wrede, Randi; Krüger, Björn
Eye-Tracking Reveals Search Behaviour in Epilepsy Patients Conference
3rd International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders, 2025.
@conference{steininger2025a,
title = {Eye-Tracking Reveals Search Behaviour in Epilepsy Patients},
author = {Melissa Steininger and Anna Jansen and Sarah Al-Haj Mustafa and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},
year = {2025},
date = {2025-03-03},
urldate = {2025-03-03},
booktitle = {3rd International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
Jansen, Anna; Steininger, Melissa; Mustafa, Sarah Al-Haj; Müllers, Johannes; Surges, Rainer; Helmstaedter, Christoph; von Wrede, Randi; Krüger, Björn
Prediction Models on Eye Tracking Data in Epilepsy Conference
3rd International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders, 2025.
@conference{jansen2025a,
title = {Prediction Models on Eye Tracking Data in Epilepsy},
author = {Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},
year = {2025},
date = {2025-03-03},
urldate = {2025-03-03},
booktitle = {3rd International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
2024
Otsubo, Hiromu; Lehnort, Marvin; Steininger, Melissa; Marquardt, Alexander; Dollack, Felix; Hirao, Yutaro; Perusquía-Hernández, Monica; Uchiyama, Hideaki; Kruijff, Ernst; Riecke, Bernhard; Kiyokawa, Kiyoshi
First-Person Perspective Induces Stronger Feelings of Awe and Presence Compared to Third-Person Perspective in Virtual Reality Proceedings Article
In: ICMI '24: Proceedings of the 26th International Conference on Multimodal Interaction, pp. 439 - 448, 2024.
@inproceedings{Otsubo2024,
title = {First-Person Perspective Induces Stronger Feelings of Awe and Presence Compared to Third-Person Perspective in Virtual Reality},
author = {Hiromu Otsubo and Marvin Lehnort and Melissa Steininger and Alexander Marquardt and Felix Dollack and Yutaro Hirao and Monica Perusquía-Hernández and Hideaki Uchiyama and Ernst Kruijff and Bernhard Riecke and Kiyoshi Kiyokawa},
doi = {10.1145/3678957.3685753},
year = {2024},
date = {2024-11-04},
urldate = {2024-11-04},
booktitle = {ICMI '24: Proceedings of the 26th International Conference on Multimodal Interaction},
pages = {439 - 448},
abstract = {Awe is a complex emotion described as a perception of vastness and a need for accommodation to integrate new, overwhelming experiences. Virtual Reality (VR) has recently gained attention as a convenient means to facilitate experiences of awe. In VR, a first-person perspective might increase awe due to its immersive nature, while a third-person perspective might enhance the perception of vastness. However, the impact of VR perspectives on experiencing awe has not been thoroughly examined. We created two types of VR scenes: one with elements designed to induce high awe, such as a snowy mountain, and a low awe scene without such elements. We compared first-person and third-person perspectives in each scene. Forty-two participants explored the VR scenes, with their physiological responses captured by electrocardiogram (ECG) and face tracking (FT). Subsequently, participants self-reported their experience of awe (AWE-S) and presence (IPQ) within VR. The results revealed that the first-person perspective induced stronger feelings of awe and presence than the third-person perspective. The findings of this study provide useful guidelines for designing VR content that enhances emotional experiences.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
Marquardt, Alexander; Lehnort, Marvin; Otsubo, Hiromu; Perusquia-Hernandez, Monica; Steininger, Melissa; Dollack, Felix; Uchiyama, Hideaki; Kiyokawa, Kiyoshi; Kruijff, Ernst
Exploring Gesture Interaction in Underwater Virtual Reality Proceedings Article
In: Proceedings of the 2024 ACM Symposium on Spatial User Interaction, pp. 1-2, 2024.
@inproceedings{marquardtLehnort2024,
title = {Exploring Gesture Interaction in Underwater Virtual Reality},
author = {Alexander Marquardt and Marvin Lehnort and Hiromu Otsubo and Monica Perusquia-Hernandez and Melissa Steininger and Felix Dollack and Hideaki Uchiyama and Kiyoshi Kiyokawa and Ernst Kruijff},
doi = {10.1145/3677386.3688890},
year = {2024},
date = {2024-10-07},
urldate = {2024-10-07},
booktitle = {Proceedings of the 2024 ACM Symposium on Spatial User Interaction},
pages = {1-2},
abstract = {An underwater virtual reality (UVR) system with gesture-based controls was developed to facilitate navigation and interaction while submerged. The system uses a waterproof head-mounted display and camera-based gesture recognition, originally trained for above-water conditions, employing three gestures: grab for navigation, pinch for single interactions, and point for continuous interactions. In an experimental study, we tested gesture recognition both above and underwater, and evaluated participant interaction within an immersive underwater scene. Results showed that underwater conditions slightly affected gesture accuracy, but the system maintained high performance. Participants reported a strong sense of presence and found the gestures intuitive while highlighting the need for further refinement to address usability challenges.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
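The grab/pinch/point design described above amounts to a small gesture-to-action dispatch once the recognizer emits a label; a hypothetical sketch (the handlers are invented stand-ins, not the UVR system's actual code):

# Minimal sketch of the gesture mapping from the abstract: grab steers
# navigation, pinch fires a single interaction, point drives a continuous
# interaction. All handler bodies are hypothetical placeholders.
from typing import Callable, Dict

def navigate() -> None:
    print("grab: steering the diver")       # continuous locomotion

def single_interaction() -> None:
    print("pinch: single interaction")      # e.g., select an object

def continuous_interaction() -> None:
    print("point: continuous interaction")  # e.g., sustained pointing

HANDLERS: Dict[str, Callable[[], None]] = {
    "grab": navigate,
    "pinch": single_interaction,
    "point": continuous_interaction,
}

def on_gesture(label: str) -> None:
    # Dispatch a recognized gesture label to its interaction handler.
    handler = HANDLERS.get(label)
    if handler is not None:
        handler()

for recognized in ["grab", "pinch", "point"]:
    on_gesture(recognized)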
Steininger, Melissa; Perusquía-Hernández, Monica; Marquardt, Alexander; Otsubo, Hiromu; Lehnort, Marvin; Dollack, Felix; Kiyokawa, Kiyoshi; Kruijff, Ernst; Riecke, Bernhard
Using Skin Conductance to Predict Awe and Perceived Vastness in Virtual Reality Conference
12th International Conference on Affective Computing and Intelligent Interaction, 2024.
@conference{steininger2024,
title = {Using Skin Conductance to Predict Awe and Perceived Vastness in Virtual Reality},
author = {Melissa Steininger and Monica Perusquía-Hernández and Alexander Marquardt and Hiromu Otsubo and Marvin Lehnort and Felix Dollack and Kiyoshi Kiyokawa and Ernst Kruijff and Bernhard Riecke},
year = {2024},
date = {2024-09-17},
urldate = {2024-09-17},
booktitle = {12th International Conference on Affective Computing and Intelligent Interaction},
abstract = {Awe is an emotion characterized by the perception of vastness and the need to accommodate this vastness into one’s mental framework. We propose an elicitation scene to induce awe in Virtual Reality (VR), validate it through self-report, and explore the feasibility of using skin conductance to predict self-reported awe and vastness as labeled from the stimuli in VR. Sixty-two participants took part in the study comparing the awe-eliciting space scene and a neutral scene. The space scene was confirmed as more awe-eliciting. A k-nearest neighbor algorithm confirmed the high- and low-awe score clusters used to label the data. A Random Forest algorithm achieved 65% accuracy (F1 = 0.56, AUC = 0.73) when predicting the self-reported low and high awe categories from continuous skin conductance data. A similar approach achieved 55% accuracy (F1 = 0.59, AUC = 0.56) when predicting the perception of vastness. These results underscore the potential of skin-conductance-based algorithms to predict awe.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
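A minimal sketch of the classification step the abstract reports: a Random Forest predicting high vs. low self-reported awe from skin conductance features, scored with the same metrics. The feature table, column names, and train/test split are illustrative assumptions, not the authors' actual pipeline:

# Sketch: Random Forest predicting high/low awe from skin conductance
# features, evaluated with accuracy, F1, and AUC as in the abstract.
# The feature file and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("scr_features.csv")  # hypothetical per-window features
X = df[["scl_mean", "scr_peak_count", "scr_amplitude_mean"]]
y = df["high_awe"]                    # 1 = high awe, 0 = low awe

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
proba = clf.predict_proba(X_test)[:, 1]
print(f"accuracy = {accuracy_score(y_test, pred):.2f}, "
      f"F1 = {f1_score(y_test, pred):.2f}, "
      f"AUC = {roc_auc_score(y_test, proba):.2f}")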
Marquardt, Alexander; Steininger, Melissa; Trepkowski, Christina; Weier, Martin; Kruijff, Ernst
Selection Performance and Reliability of Eye and Head Gaze Tracking Under Varying Light Conditions Conference
2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 2024.
@conference{marquardt2024,
title = {Selection Performance and Reliability of Eye and Head Gaze Tracking Under Varying Light Conditions},
author = {Alexander Marquardt and Melissa Steininger and Christina Trepkowski and Martin Weier and Ernst Kruijff},
doi = {10.1109/VR58804.2024.00075},
year = {2024},
date = {2024-04-15},
urldate = {2024-04-15},
booktitle = {2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR)},
abstract = {Augmented Reality (AR) applications increasingly rely on eye and head gaze tracking for user interaction, with their efficacy influenced by environmental factors such as spatial arrangements and lighting conditions. This paper presents two studies that examine how these variables affect the performance of eye and head gaze tracking in AR environments. While eye tracking partially delivered faster results, its performance exhibited greater variability, especially under dynamic lighting conditions. Conversely, head gaze tracking, while providing more consistent results, showed a notable reduction in accuracy in environments with fluctuating light levels. Furthermore, the spatial properties of the environment had notable implications on both tracking methods. Our research demonstrates that both spatial properties and lighting conditions are key determinants in the choice of a tracking method, underscoring the need for AR systems that can dynamically adapt to these environmental variables.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}