<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Publications &#8211; Digital Health Bonn</title>
	<atom:link href="https://digital-health-bonn.de/category/publications/feed/" rel="self" type="application/rss+xml" />
	<link>https://digital-health-bonn.de</link>
	<description>Working Group &#34;Personalized Digital Health and Telemedicine&#34;</description>
	<lastBuildDate>Mon, 09 Mar 2026 15:51:15 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://digital-health-bonn.de/wp-content/uploads/2024/02/cropped-cropped-uni-ukb-logo-32x32.png</url>
	<title>Publications &#8211; Digital Health Bonn</title>
	<link>https://digital-health-bonn.de</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Seven Contributions from Our Group at the DGfE 2026 Annual Meeting</title>
		<link>https://digital-health-bonn.de/seven-contributions-from-our-group-at-the-dgfe-2026-annual-meeting/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 09 Mar 2026 15:34:35 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[AI in epilepsy]]></category>
		<category><![CDATA[computer vision in healthcare]]></category>
		<category><![CDATA[contactless vital sign monitoring]]></category>
		<category><![CDATA[DGfE 2026]]></category>
		<category><![CDATA[digital biomarkers]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[eye tracking]]></category>
		<category><![CDATA[human pose estimation]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[seizure detection]]></category>
		<category><![CDATA[video-EEG monitoring]]></category>
		<category><![CDATA[wearable health monitoring]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=770</guid>

					<description><![CDATA[Researchers from the Personalized Digital Health and Telemedicine Group at the University Hospital Bonn (UKB) will present seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE) in Würzburg. The accepted contributions highlight our group’s [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Researchers from the <strong>Personalized Digital Health and Telemedicine Group</strong> at the <strong>University Hospital Bonn (UKB)</strong> will present <strong>seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE)</strong> in Würzburg.</p>



<p>The accepted contributions highlight our group’s interdisciplinary research at the intersection of <strong>artificial intelligence, digital health, computer vision, wearable sensing, and clinical epileptology</strong>. Our work focuses on the development of <strong>digital biomarkers and AI-based analysis methods</strong> to improve the monitoring, diagnosis, and understanding of epilepsy.</p>



<h3 class="wp-block-heading">Research Topics</h3>



<p>The presented studies cover several emerging directions in <strong>AI-driven epilepsy research</strong>, including:</p>



<ul class="wp-block-list">
<li><strong>Contactless measurement of vital parameters</strong> using camera-based sensing technologies</li>



<li><strong>3D human pose estimation for video-EEG monitoring</strong> to analyze seizure-related body movements</li>



<li><strong>Eye-tracking metrics for cognitive assessment in epilepsy</strong>, including contextualized analysis of visual search strategies</li>



<li><strong>Effects of anti-seizure medication on eye-tracking parameters</strong> in patients with epilepsy</li>



<li><strong>Eye tracking during the Trail Making Test</strong> and its association with depressive symptoms</li>



<li><strong>Secure and trustworthy AI models for seizure detection using wearable devices</strong></li>



<li><strong>Electrocardiographic changes under cenobamate therapy</strong> based on extended ECG recordings</li>
</ul>



<p>These contributions reflect ongoing collaborations between <strong>computer scientists, physicists, neurologists, and clinical researchers</strong> at the University Hospital Bonn and partner institutions.</p>



<p>Much of this work was carried out by <strong>PhD students and early-career researchers</strong>, reflecting the strong involvement of young scientists in our group&#8217;s research activities.</p>



<p>We look forward to presenting our work in Würzburg and discussing these results with colleagues from the international <strong>epilepsy research and digital medicine community</strong>.</p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Simsek, Koray;  Müllers, Johannes;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Kontaktloses kamerabasiertes Messen von Vitalparametern <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_123" class="tp_show" onclick="teachpress_pub_showhide('123','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_123" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{simsek2026a,<br />
title = {Kontaktloses kamerabasiertes Messen von Vitalparametern},<br />
author = {Koray Simsek and Johannes Müllers and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('123','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Jansen, Anna;  Steininger, Melissa;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_124" class="tp_show" onclick="teachpress_pub_showhide('124','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_124" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{nokey,<br />
title = {Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen},<br />
author = {Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('124','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Greß, Hannah;  Daryakenari, Nazila Ahmadi;  Bungartz, Christian;  Viola, Felix;  Markwald, Marco;  Brüll, Gabriela;  Kumar, Uttam;  Ohm, Marc;  Surges, Rainer;  Meier, Michael;  Demidova, Elena;  Krüger, Björn</p><p class="tp_pub_title">Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_125" class="tp_show" onclick="teachpress_pub_showhide('125','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_125" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{gress2026a,<br />
title = {Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie},<br />
author = {Hannah Greß and Nazila Ahmadi Daryakenari and Christian Bungartz and Felix Viola and Marco Markwald and Gabriela Brüll and Uttam Kumar and Marc Ohm and Rainer Surges and Michael Meier and Elena Demidova and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('125','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Vetter, Jonas;  Müllers, Johannes;  Spurio, Federico;  Surges, Rainer;  Gall, Juergen;  Krüger, Björn</p><p class="tp_pub_title">Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_126" class="tp_show" onclick="teachpress_pub_showhide('126','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_126" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{vetter2026,<br />
title = {Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten},<br />
author = {Jonas Vetter and Johannes Müllers and Federico Spurio and Rainer Surges and Juergen Gall and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('126','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Pukropski, Jan;  Keßler, Lisa; von Wrede, Randi;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_127" class="tp_show" onclick="teachpress_pub_showhide('127','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_127" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{Pukropski2026a,<br />
title = {Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie},<br />
author = {Jan Pukropski and Lisa Keßler and Randi von Wrede and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('127','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Steininger, Melissa;  Jansen, Anna;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_128" class="tp_show" onclick="teachpress_pub_showhide('128','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_128" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{steininger2026c,<br />
title = {Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie},<br />
author = {Melissa Steininger and Anna Jansen and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('128','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Mustafa, Sarah Al-Haj;  Jansen, Anna;  Steininger, Melissa;  Müllers, Johannes;  Surges, Rainer;  Helmstaedter, Christoph;  Krüger, Björn; von Wrede, Randi</p><p class="tp_pub_title">Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_129" class="tp_show" onclick="teachpress_pub_showhide('129','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_129" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{mustafa2026a,<br />
title = {Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik},<br />
author = {Sarah Al-Haj Mustafa and Anna Jansen and Melissa Steininger and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Björn Krüger and Randi von Wrede},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('129','tp_bibtex')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New Publication: Activity-Based Emotion Detection Using Wearable Sensors and Transfer Learning</title>
		<link>https://digital-health-bonn.de/new-publication-activity-based-emotion-detection-using-wearable-sensors-and-transfer-learning/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Wed, 25 Feb 2026 15:40:24 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[activity recognition]]></category>
		<category><![CDATA[affective computing]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[emotion detection]]></category>
		<category><![CDATA[emotion recognition from movement]]></category>
		<category><![CDATA[human activity sensing]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[transfer learning]]></category>
		<category><![CDATA[wearable IoT]]></category>
		<category><![CDATA[wearable machine learning]]></category>
		<category><![CDATA[wearable sensors]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=773</guid>

					<description><![CDATA[Understanding human emotions from everyday behavior is an important goal in digital health, affective computing, and wearable sensing. In our latest publication in the IEEE Internet of Things Journal, we investigate how movement data from wearable sensors can be used [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Understanding human emotions from everyday behavior is an important goal in <strong>digital health, affective computing, and wearable sensing</strong>. In our latest publication in the <strong>IEEE Internet of Things Journal</strong>, we investigate how <strong>movement data from wearable sensors</strong> can be used to infer emotional states.</p>



<p>The paper <strong>“From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems”</strong> explores how <strong>machine learning models trained for activity recognition can be adapted to detect emotional states</strong>. Using a <strong>cross-domain transfer learning approach</strong>, the study demonstrates how knowledge learned from physical activity data can be transferred to emotion recognition tasks.</p>



<p>Wearable devices continuously capture <strong>movement patterns, step dynamics, and physical activity signals</strong>, which provide valuable information about human behavior in everyday environments. By leveraging these signals, the proposed approach opens new opportunities for <strong>non-invasive emotion monitoring using wearable IoT systems</strong>.</p>



<p>Such technologies have potential applications in <strong>mental health monitoring, digital phenotyping, and personalized healthcare</strong>, where unobtrusive sensing and intelligent data analysis can support early detection of behavioral and emotional changes.</p>



<p>This publication also continues a <strong>long-standing research collaboration between Qaiser Riaz and Björn Krüger</strong>, focusing on <strong>machine learning methods for human movement analysis and wearable sensor systems</strong>.</p>



<p>The full publication can be accessed via IEEE Xplore:<br><a href="https://ieeexplore.ieee.org/document/11404152">https://ieeexplore.ieee.org/document/11404152</a></p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_article"><div class="tp_pub_info"><p class="tp_pub_author"> Imran, Hamza Ali;  Riaz, Qaiser;  Hamza, Kiran;  Muhammad, Shaida;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('122','tp_links')" style="cursor:pointer;">From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems</a> <span class="tp_pub_type tp_  article">Journal Article</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_in">In: </span><span class="tp_pub_additional_journal">IEEE Internet of Things Journal, </span><span class="tp_pub_additional_pages">pp. 1-1, </span><span class="tp_pub_additional_year">2026</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_122" style="display:none;"><div class="tp_bibtex_entry"><pre>@article{Imran2026a,<br />
title = {From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems},<br />
author = {Hamza Ali Imran and Qaiser Riaz and Kiran Hamza and Shaida Muhammad and Björn Krüger},<br />
doi = {10.1109/JIOT.2026.3666469},<br />
year  = {2026},<br />
date = {2026-02-20},<br />
urldate = {2026-02-20},<br />
journal = {IEEE Internet of Things Journal},<br />
pages = {1-1},<br />
abstract = {Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {article}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_122" style="display:none;"><div class="tp_abstract_entry">Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. 
On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_122" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1109/JIOT.2026.3666469" title="Follow DOI:10.1109/JIOT.2026.3666469" target="_blank">doi:10.1109/JIOT.2026.3666469</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_links')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Abstracts Accepted for International AI in Epilepsy Conference</title>
		<link>https://digital-health-bonn.de/abstracts-accepted-for-international-ai-in-epilepsy-conference/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 22 Dec 2025 16:14:09 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=711</guid>

					<description><![CDATA[We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the 4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders, which will take place in [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the <strong>4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders</strong>, which will take place in Puerto Rico from <strong>March 16–19, 2026</strong>.</p>



<p>The accepted contributions are:</p>



<ul class="wp-block-list">
<li><em>Linking Higher-Level Eye Tracking Metrics to High-Impact Antiseizure Medication in Epilepsy Patients</em></li>



<li><em>Higher-Level Eye Tracking Metrics Reveal Search Behaviour Differences in Persons with Epilepsy vs. Healthy Controls</em></li>
</ul>



<p>Anna Jansen and Melissa Steininger will present this work at the conference.</p>



<p>We look forward to continuing our research on digital health and AI in epilepsy in the coming year and wish everyone a restful holiday season.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>🎉 Exciting News to Close the Year! 🎉</title>
		<link>https://digital-health-bonn.de/%f0%9f%8e%89-exciting-news-to-close-the-year-%f0%9f%8e%89/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Thu, 19 Dec 2024 13:22:02 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=526</guid>

					<description><![CDATA[As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025! Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:✅ Digital Transformation of Blood [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025!</p>



<p>Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Digital Transformation of Blood Sample Management<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> EpiEye – Effects of Seizure-Suppressive Medications on Eye Movements and Autonomic Changes in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Establishing an Epileptological Teleconsultation Between Siegen Hospital and University Hospital Bonn<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> ANNE – Eye-Tracking for Detecting Side Effects in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Semiological Characteristics of Temporal Lobe Epilepsy with GAD65 Antibodies</p>



<p>Accepted Contributions – International Conference on Artificial Intelligence in Epilepsy and Other Neurological Disorders 2025:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Eye-Tracking Reveals Search Behaviour in Epilepsy Patients<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Prediction Models on Eye-Tracking Data in Epilepsy</p>



<p>Accepted Contribution – Wissenschaftliche Tagung Autismus-Spektrum (WTAS):<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Multi-Stream Analysis for Robust Head Gesture Classification in Natural Social Interaction: Leveraging High-Resolution Sensor Data for Optimized Visual Classification</p>



<p>We sincerely thank all contributors for their dedication and hard work. We look forward to engaging discussions, inspiring exchanges, and exciting collaborations at these upcoming conferences!</p>



<p>Wishing everyone a successful and inspiring year ahead!</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Pre-Impact Fall Detection: Paper Accepted!</title>
		<link>https://digital-health-bonn.de/pre-impact-fall-detection-paper-accepted/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 10 Jun 2024 15:46:08 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=332</guid>

					<description><![CDATA[We are thrilled to announce that our latest research paper, &#8220;Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes,&#8221; has been accepted for publication in the IEEE Sensors Journal. Our study focuses on developing a deep classifier to [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are thrilled to announce that our latest research paper, &#8220;Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes,&#8221; has been accepted for publication in the IEEE Sensors Journal. Our study focuses on developing a deep classifier to detect falls in the pre-impact phase using wearable sensors, achieving outstanding accuracy and speed. This research has significant implications for improving the health and safety of the elderly and people with disabilities.</p>



<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><div class="tp_publication tp_publication_article"><div class="tp_pub_info"><p class="tp_pub_author"> Kiran, Samia;  Riaz, Qaiser;  Hussain, Mehdi;  Zeeshan, Muhammad;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('75','tp_links')" style="cursor:pointer;">Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes</a> <span class="tp_pub_type tp_  article">Journal Article</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_in">In: </span><span class="tp_pub_additional_journal">IEEE Sensors Journal, </span><span class="tp_pub_additional_volume">vol. 24, </span><span class="tp_pub_additional_number">no. 15, </span><span class="tp_pub_additional_pages">pp. 24086-24095, </span><span class="tp_pub_additional_year">2024</span>, <span class="tp_pub_additional_issn">ISSN: 1558-1748</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_75" style="display:none;"><div class="tp_bibtex_entry"><pre>@article{10552639,<br />
title = {Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes},<br />
author = {Samia Kiran and Qaiser Riaz and Mehdi Hussain and Muhammad Zeeshan and Björn Krüger},<br />
doi = {10.1109/JSEN.2024.3407835},<br />
issn = {1558-1748},<br />
year  = {2024},<br />
date = {2024-08-01},<br />
urldate = {2024-01-01},<br />
journal = {IEEE Sensors Journal},<br />
volume = {24},<br />
number = {15},<br />
pages = {24086-24095},<br />
abstract = {Falling poses a significant challenge to the health and well-being of the elderly and people with various disabilities. Precise and prompt fall detection plays a crucial role in preventing falls and mitigating the impact of injuries. In this research, we propose a deep classifier for pre-impact fall detection which can detect a fall in the pre-impact phase with an inference time of 46–52 milliseconds. The proposed classifier is an ensemble of Convolutional Neural Networks (CNNs) and Bidirectional Gated Recurrent Units (BiGRU) with residual connections. We validated the performance of the proposed classifier on a comprehensive, publicly available preimpact fall dataset. The dataset covers 36 diverse activities, including 15 types of fall-related activities and 21 types of activities of daily living (ADLs). Furthermore, we evaluated the proposed model using three different inputs of varying dimensions: 6D input (comprising 3D accelerations and 3D angular velocities), 3D input (3D accelerations), and 1D input (magnitude of 3D accelerations). The reduction in the input space from 6D to 1D is aimed at minimizing the computation cost. We have attained commendable results outperforming the state-of-the-art approaches by achieving an average accuracy and F1 score of 98% for 6D input size. The potential implications of this research are particularly relevant in the realm of smart healthcare, with a focus on the elderly and differently-abled population.},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {article}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_75" style="display:none;"><div class="tp_abstract_entry">Falling poses a significant challenge to the health and well-being of the elderly and people with various disabilities. Precise and prompt fall detection plays a crucial role in preventing falls and mitigating the impact of injuries. In this research, we propose a deep classifier for pre-impact fall detection which can detect a fall in the pre-impact phase with an inference time of 46–52 milliseconds. The proposed classifier is an ensemble of Convolutional Neural Networks (CNNs) and Bidirectional Gated Recurrent Units (BiGRU) with residual connections. We validated the performance of the proposed classifier on a comprehensive, publicly available preimpact fall dataset. The dataset covers 36 diverse activities, including 15 types of fall-related activities and 21 types of activities of daily living (ADLs). Furthermore, we evaluated the proposed model using three different inputs of varying dimensions: 6D input (comprising 3D accelerations and 3D angular velocities), 3D input (3D accelerations), and 1D input (magnitude of 3D accelerations). The reduction in the input space from 6D to 1D is aimed at minimizing the computation cost. We have attained commendable results outperforming the state-of-the-art approaches by achieving an average accuracy and F1 score of 98% for 6D input size. 
The potential implications of this research are particularly relevant in the realm of smart healthcare, with a focus on the elderly and differently-abled population.</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_75" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1109/JSEN.2024.3407835" title="Follow DOI:10.1109/JSEN.2024.3407835" target="_blank">doi:10.1109/JSEN.2024.3407835</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_links')">Close</a></p></div></div></div></div></div>
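<p>The abstract above compares 6D (accelerations plus angular velocities), 3D (accelerations), and 1D inputs, where the 1D channel is the magnitude of the 3D accelerations, chosen to minimize computation cost. As a minimal illustration of that input reduction (array shapes and names are our own, not from the paper):</p>

```python
import numpy as np

# Illustrative sketch of the 6D -> 1D input reduction described in the
# abstract: the 1D channel is the Euclidean norm of the 3D accelerometer
# signal. Shapes and function names are assumptions for this example.

def acceleration_magnitude(acc_xyz: np.ndarray) -> np.ndarray:
    """Collapse a (T, 3) accelerometer window into a (T,) magnitude signal."""
    return np.linalg.norm(acc_xyz, axis=-1)

window = np.array([[0.0, 3.0, 4.0],
                   [1.0, 2.0, 2.0]])
print(acceleration_magnitude(window))  # [5. 3.]
```

<p>Feeding this single channel to the classifier trades some information for a smaller input space, which is the cost/accuracy trade-off the paper evaluates.</p>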
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Poster at DGfE 2024</title>
		<link>https://digital-health-bonn.de/poster-at-dgfe-2024/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Tue, 27 Feb 2024 12:00:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=150</guid>

					<description><![CDATA[Our abstract &#8220;Sensorik am Krankenbett – Synchrone Datenakquise für Studien in der Epileptologie&#8221; will be presented as a digital poster at the 62. Jahrestagung der Deutschen Gesellschaft für Epileptologie in June.]]></description>
										<content:encoded><![CDATA[
<p>Our abstract &#8220;Sensorik am Krankenbett – Synchrone Datenakquise für Studien in der Epileptologie&#8221; will be presented as a digital poster at the 62. Jahrestagung der Deutschen Gesellschaft für Epileptologie in June.</p>



<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Müllers, Johannes;  Greß, Hannah;  Haaga, Lisa;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('65','tp_links')" style="cursor:pointer;">Sensorik am Krankenbett – Synchrone Datenakquise für Studien in der Epileptologie</a> <span class="tp_pub_type tp_  conference">Conference</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">Clinical Epileptology, </span><span class="tp_pub_additional_volume">vol. 37 (Suppl 1), </span><span class="tp_pub_additional_year">2024</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_65" class="tp_show" onclick="teachpress_pub_showhide('65','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_65" class="tp_show" onclick="teachpress_pub_showhide('65','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_65" class="tp_show" onclick="teachpress_pub_showhide('65','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_65" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{muellers2024,<br />
title = {Sensorik am Krankenbett – Synchrone Datenakquise für Studien in der Epileptologie},<br />
author = {Johannes Müllers and Hannah Greß and Lisa Haaga and Björn Krüger},<br />
doi = {10.1007/s10309-024-00672-x},<br />
year  = {2024},<br />
date = {2024-04-18},<br />
urldate = {2024-04-18},<br />
booktitle = {Clinical Epileptology},<br />
issuetitle = {Abstracts zur 62. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
volume = {37 (Suppl 1)},<br />
pages = {1–73},<br />
abstract = {Die Möglichkeit der Anfallserkennung oder -vorhersage außerhalb des Krankenhauses kann die Lebensqualität und das Sicherheitsbedürfnis von Epilepsiepatienten erhöhen. Die Überwachung von Vitalparametern, Bewegungen und weiteren Messgrößen kann von einer Vielzahl von Wearables oder sonstigen neuartigen Sensorsystemen gewährleistet werden. Videoüberwachte EEG-Messplätze dienen als Goldstandard und werden für Studien mit solchen Sensoren genutzt, um Korrelationen festzustellen. Hierbei stellen technische Herausforderungen ein wiederkehrendes Problem dar. Neben der Inbetriebnahme der Sensorsysteme, die ohne informationstechnische Kenntnisse oft nur mit proprietären Mitteln möglich ist, ist insbesondere die Synchronizität zur EEG-Aufzeichnung anspruchsvoll. Aktuelle Vorbereitungen einer Studie mit Eye-Tracker Brillen bieten den Anlass, ein neues System zur Datenakquisition aufzubauen. },<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('65','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_65" style="display:none;"><div class="tp_abstract_entry">Die Möglichkeit der Anfallserkennung oder -vorhersage außerhalb des Krankenhauses kann die Lebensqualität und das Sicherheitsbedürfnis von Epilepsiepatienten erhöhen. Die Überwachung von Vitalparametern, Bewegungen und weiteren Messgrößen kann von einer Vielzahl von Wearables oder sonstigen neuartigen Sensorsystemen gewährleistet werden. Videoüberwachte EEG-Messplätze dienen als Goldstandard und werden für Studien mit solchen Sensoren genutzt, um Korrelationen festzustellen. Hierbei stellen technische Herausforderungen ein wiederkehrendes Problem dar. Neben der Inbetriebnahme der Sensorsysteme, die ohne informationstechnische Kenntnisse oft nur mit proprietären Mitteln möglich ist, ist insbesondere die Synchronizität zur EEG-Aufzeichnung anspruchsvoll. Aktuelle Vorbereitungen einer Studie mit Eye-Tracker Brillen bieten den Anlass, ein neues System zur Datenakquisition aufzubauen. </div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('65','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_65" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1007/s10309-024-00672-x" title="Follow DOI:10.1007/s10309-024-00672-x" target="_blank">doi:10.1007/s10309-024-00672-x</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('65','tp_links')">Close</a></p></div></div></div></div></div>
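<p>One recurring hurdle the abstract names is keeping sensor streams synchronous with the video-EEG recording. As an illustrative sketch (our own, not from the abstract): if both systems log a shared marker event, a constant clock offset can be estimated robustly and applied to the sensor timestamps.</p>

```python
import numpy as np

# Hypothetical sketch of clock-offset alignment between a wearable and an
# EEG recorder that both observe the same sync markers. Names and values
# are assumptions for this example.

def estimate_offset(eeg_marks, sensor_marks):
    """Median difference between matched marker timestamps (seconds)."""
    return float(np.median(np.asarray(eeg_marks) - np.asarray(sensor_marks)))

eeg_marks = [10.02, 20.01, 30.03]     # marker times on the EEG clock
sensor_marks = [8.00, 18.00, 28.01]   # same markers on the wearable clock

offset = estimate_offset(eeg_marks, sensor_marks)
aligned = np.asarray(sensor_marks) + offset
```

<p>The median makes the estimate robust to a single mistimed marker; drifting clocks would additionally need a linear fit rather than a constant offset.</p>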



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Contribution accepted for the Bernstein Conference 2023</title>
		<link>https://digital-health-bonn.de/diagnosing-rare-diseases-by-movement-primitive-based-classification-of-kinematic-gait-data/</link>
					<comments>https://digital-health-bonn.de/diagnosing-rare-diseases-by-movement-primitive-based-classification-of-kinematic-gait-data/#respond</comments>
		
		<dc:creator><![CDATA[Hannah Greß]]></dc:creator>
		<pubDate>Tue, 08 Aug 2023 14:29:19 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=61</guid>

					<description><![CDATA[Our abstract titled &#8220;Diagnosing Rare Diseases by Movement Primitive-Based Classification of Kinematic Gait Data&#8221; was accepted as a poster presentation and will be presented by our collaboration partner Jing Xu from Marburg.]]></description>
										<content:encoded><![CDATA[
<p>Our abstract titled &#8220;Diagnosing Rare Diseases by Movement Primitive-Based Classification of Kinematic Gait Data&#8221; was accepted as a poster presentation and will be presented by our collaboration partner Jing Xu from Marburg.</p>



<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><div class="tp_publication tp_publication_proceedings"><div class="tp_pub_info"><p class="tp_pub_author"> Xu, Jing;  Greß, Hannah;  Seefried, Sabine; van Drongelen, Stefan;  Schween, Raphael;  Sommer, Claudia;  Endres, Dominik;  Krüger, Björn;  Stief, Felix</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('60','tp_links')" style="cursor:pointer;">Diagnosing Rare Diseases by Movement Primitive-Based Classification of Kinematic Gait Data</a> <span class="tp_pub_type tp_  proceedings">Proceedings</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_howpublished">Bernstein Conference, </span><span class="tp_pub_additional_year">2023</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_60" class="tp_show" onclick="teachpress_pub_showhide('60','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_60" class="tp_show" onclick="teachpress_pub_showhide('60','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_60" class="tp_show" onclick="teachpress_pub_showhide('60','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_60" style="display:none;"><div class="tp_bibtex_entry"><pre>@proceedings{JingXu2023,<br />
title = {Diagnosing Rare Diseases by Movement Primitive-Based Classification of Kinematic Gait Data},<br />
author = {Jing Xu and Hannah Greß and Sabine Seefried and Stefan van Drongelen and Raphael Schween and Claudia Sommer and Dominik Endres and Björn Krüger and Felix Stief},<br />
url = {https://abstracts.g-node.org/conference/BC23/abstracts#/uuid/31c21041-91a0-46bd-87dc-46271501fdc0},<br />
doi = {10.12751/nncn.bc2023.313},<br />
year  = {2023},<br />
date = {2023-01-10},<br />
urldate = {2023-01-10},<br />
booktitle = { Bernstein Conference 2023},<br />
abstract = {Of over 6.000 known rare diseases, a considerable portion involves motor symptoms [1]. Whereas aiding diagnosis by artificial intelligence based on non-motor symptoms has shown promise [2], the potential of using movement data to this purpose has not yet been fully investigated. We therefore aim to implement a machine learning algorithm inspired by biological motor control to aid diagnosis of rare diseases by classifying data from standard kinematic clinical gait analysis.<br />
<br />
Starting from 42-degrees-of-freedom time series of joint angles extracted from motion capture data with custom routines [3], we employ a Gaussian process-based temporal movement primitive algorithm [4] in order to reduce the data to sets of movement primitives and weight vectors that capture the essential characteristics of the gait movement. The primitives are participant (and disease) -independent and represent general human gait. The weights are participant-specific and thus contain disease-specific information. A weighted combination of the primitives can thus generate participant specific gait data. We then apply standard classification tools such as Support Vector Machines and Random Forests to the weights to distinguish the disease from the control gait. The primary goal is to reliably differentiate patients from age-matched controls in an existing data set on patients with Legg–Calvé–Perthes disease (LCPD). A secondary goal is to allow the classifier to expand the set of diseases using nonparametric methods such as the Dirichlet process.<br />
<br />
Importantly, our movement primitive algorithm is inspired by current theories of biological motor control with a potential edge over standard algorithms in training on small case numbers. The temporal primitives are analogous to central pattern generators in the spinal cord [5], whereas the weights reflect activation of these central patterns by more central mechanisms in a hierarchical control scheme. In such a control scheme, disease-specific changes in weights may be caused directly by disease-specific influences on neural signaling, such as in the Stiff Person Syndrome [6], or indirectly through pain-avoidance in orthopedic conditions such as LCPD.<br />
<br />
With further development, our approach holds potential for facilitating early detection and improving treatment strategies across a wide range of rare movement disorders and orthopedic conditions.},<br />
howpublished = {Bernstein Conference},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {proceedings}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('60','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_60" style="display:none;"><div class="tp_abstract_entry">Of over 6.000 known rare diseases, a considerable portion involves motor symptoms [1]. Whereas aiding diagnosis by artificial intelligence based on non-motor symptoms has shown promise [2], the potential of using movement data to this purpose has not yet been fully investigated. We therefore aim to implement a machine learning algorithm inspired by biological motor control to aid diagnosis of rare diseases by classifying data from standard kinematic clinical gait analysis.<br />
<br />
Starting from 42-degrees-of-freedom time series of joint angles extracted from motion capture data with custom routines [3], we employ a Gaussian process-based temporal movement primitive algorithm [4] in order to reduce the data to sets of movement primitives and weight vectors that capture the essential characteristics of the gait movement. The primitives are participant (and disease) -independent and represent general human gait. The weights are participant-specific and thus contain disease-specific information. A weighted combination of the primitives can thus generate participant specific gait data. We then apply standard classification tools such as Support Vector Machines and Random Forests to the weights to distinguish the disease from the control gait. The primary goal is to reliably differentiate patients from age-matched controls in an existing data set on patients with Legg–Calvé–Perthes disease (LCPD). A secondary goal is to allow the classifier to expand the set of diseases using nonparametric methods such as the Dirichlet process.<br />
<br />
Importantly, our movement primitive algorithm is inspired by current theories of biological motor control with a potential edge over standard algorithms in training on small case numbers. The temporal primitives are analogous to central pattern generators in the spinal cord [5], whereas the weights reflect activation of these central patterns by more central mechanisms in a hierarchical control scheme. In such a control scheme, disease-specific changes in weights may be caused directly by disease-specific influences on neural signaling, such as in the Stiff Person Syndrome [6], or indirectly through pain-avoidance in orthopedic conditions such as LCPD.<br />
<br />
With further development, our approach holds potential for facilitating early detection and improving treatment strategies across a wide range of rare movement disorders and orthopedic conditions.</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('60','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_60" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="fas fa-globe"></i><a class="tp_pub_list" href="https://abstracts.g-node.org/conference/BC23/abstracts#/uuid/31c21041-91a0-46bd-87dc-46271501fdc0" title="https://abstracts.g-node.org/conference/BC23/abstracts#/uuid/31c21041-91a0-46bd-[...]" target="_blank">https://abstracts.g-node.org/conference/BC23/abstracts#/uuid/31c21041-91a0-46bd-[...]</a></li><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.12751/nncn.bc2023.313" title="Follow DOI:10.12751/nncn.bc2023.313" target="_blank">doi:10.12751/nncn.bc2023.313</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('60','tp_links')">Close</a></p></div></div></div></div></div>
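<p>The pipeline in the abstract decomposes gait time series into shared, participant-independent primitives and participant-specific weights, then classifies the weights. As a simplified stand-in (the paper uses Gaussian-process temporal movement primitives; here a plain PCA basis plays the role of the primitives, and all data is synthetic):</p>

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

# Simplified, synthetic stand-in for the primitive-weight pipeline: a PCA
# basis acts as the shared "primitives"; the projection weights are
# participant-specific and are fed to a standard classifier (SVM).

rng = np.random.default_rng(0)
T = 100                                  # samples per gait cycle
t = np.linspace(0, 2 * np.pi, T)

def synthetic_gait(n, phase_shift):
    """Toy joint-angle trajectories; 'patients' get a phase-shifted pattern."""
    return np.sin(t + phase_shift) + 0.05 * rng.standard_normal((n, T))

controls = synthetic_gait(20, 0.0)
patients = synthetic_gait(20, 0.6)
X = np.vstack([controls, patients])
y = np.array([0] * 20 + [1] * 20)

pca = PCA(n_components=4).fit(X)         # shared basis ("primitives")
weights = pca.transform(X)               # participant-specific weights
clf = SVC().fit(weights, y)              # classify weights, not raw signals
```

<p>Classifying the low-dimensional weights rather than the raw trajectories is what gives the approach its potential edge on small case numbers, as the abstract argues.</p>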
]]></content:encoded>
					
					<wfw:commentRss>https://digital-health-bonn.de/diagnosing-rare-diseases-by-movement-primitive-based-classification-of-kinematic-gait-data/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
