<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>digital health &#8211; Digital Health Bonn</title>
	<atom:link href="https://digital-health-bonn.de/tag/digital-health/feed/" rel="self" type="application/rss+xml" />
	<link>https://digital-health-bonn.de</link>
	<description>Working Group &#34;Personalized Digital Health and Telemedicine&#34;</description>
	<lastBuildDate>Mon, 09 Mar 2026 15:51:15 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://digital-health-bonn.de/wp-content/uploads/2024/02/cropped-cropped-uni-ukb-logo-32x32.png</url>
	<title>digital health &#8211; Digital Health Bonn</title>
	<link>https://digital-health-bonn.de</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Seven Contributions from Our Group at the DGfE 2026 Annual Meeting</title>
		<link>https://digital-health-bonn.de/seven-contributions-from-our-group-at-the-dgfe-2026-annual-meeting/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 09 Mar 2026 15:34:35 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[AI in epilepsy]]></category>
		<category><![CDATA[computer vision in healthcare]]></category>
		<category><![CDATA[contactless vital sign monitoring]]></category>
		<category><![CDATA[DGfE 2026]]></category>
		<category><![CDATA[digital biomarkers]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[eye tracking]]></category>
		<category><![CDATA[human pose estimation]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[seizure detection]]></category>
		<category><![CDATA[video-EEG monitoring]]></category>
		<category><![CDATA[wearable health monitoring]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=770</guid>

					<description><![CDATA[Researchers from the Personalized Digital Health and Telemedicine Group at the University Hospital Bonn (UKB) will present seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE) in Würzburg. The accepted contributions highlight our group’s [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Researchers from the <strong>Personalized Digital Health and Telemedicine Group</strong> at the <strong>University Hospital Bonn (UKB)</strong> will present <strong>seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE)</strong> in Würzburg.</p>



<p>The accepted contributions highlight our group’s interdisciplinary research at the intersection of <strong>artificial intelligence, digital health, computer vision, wearable sensing, and clinical epileptology</strong>. Our work focuses on the development of <strong>digital biomarkers and AI-based analysis methods</strong> to improve the monitoring, diagnosis, and understanding of epilepsy.</p>



<h3 class="wp-block-heading">Research Topics</h3>



<p>The presented studies cover several emerging directions in <strong>AI-driven epilepsy research</strong>, including:</p>



<ul class="wp-block-list">
<li><strong>Contactless measurement of vital parameters</strong> using camera-based sensing technologies</li>



<li><strong>3D human pose estimation for video-EEG monitoring</strong> to analyze seizure-related body movements</li>



<li><strong>Eye-tracking metrics for cognitive assessment in epilepsy</strong>, including contextualized analysis of visual search strategies</li>



<li><strong>Effects of anti-seizure medication on eye-tracking parameters</strong> in patients with epilepsy</li>



<li><strong>Eye tracking during the Trail Making Test</strong> and its association with depressive symptoms</li>



<li><strong>Secure and trustworthy AI models for seizure detection using wearable devices</strong></li>



<li><strong>Electrocardiographic changes under cenobamate therapy</strong> based on extended ECG recordings</li>
</ul>



<p>These contributions reflect ongoing collaborations between <strong>computer scientists, physicists, neurologists, and clinical researchers</strong> at the University Hospital Bonn and partner institutions.</p>



<p>Much of this work was carried out by <strong>PhD students and early-career researchers</strong>, underscoring how strongly young scientists are integrated into our research activities.</p>



<p>We look forward to presenting our work in Würzburg and discussing these results with colleagues from the international <strong>epilepsy research and digital medicine community</strong>.</p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Simsek, Koray;  Müllers, Johannes;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Kontaktloses kamerabasiertes Messen von Vitalparametern <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_123" class="tp_show" onclick="teachpress_pub_showhide('123','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_123" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{simsek2026a,<br />
title = {Kontaktloses kamerabasiertes Messen von Vitalparametern},<br />
author = {Koray Simsek and Johannes Müllers and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('123','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Jansen, Anna;  Steininger, Melissa;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_124" class="tp_show" onclick="teachpress_pub_showhide('124','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_124" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{nokey,<br />
title = {Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen},<br />
author = {Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('124','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Greß, Hannah;  Daryakenari, Nazila Ahmadi;  Bungartz, Christian;  Viola, Felix;  Markwald, Marco;  Brüll, Gabriela;  Kumar, Uttam;  Ohm, Marc;  Surges, Rainer;  Meier, Michael;  Demidova, Elena;  Krüger, Björn</p><p class="tp_pub_title">Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_125" class="tp_show" onclick="teachpress_pub_showhide('125','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_125" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{gress2026a,<br />
title = {Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie},<br />
author = {Hannah Greß and Nazila Ahmadi Daryakenari and Christian Bungartz and Felix Viola and Marco Markwald and Gabriela Brüll and Uttam Kumar and Marc Ohm and Rainer Surges and Michael Meier and Elena Demidova and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('125','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Vetter, Jonas;  Müllers, Johannes;  Spurio, Federico;  Surges, Rainer;  Gall, Juergen;  Krüger, Björn</p><p class="tp_pub_title">Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_126" class="tp_show" onclick="teachpress_pub_showhide('126','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_126" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{vetter2026,<br />
title = {Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten},<br />
author = {Jonas Vetter and Johannes Müllers and Federico Spurio and Rainer Surges and Juergen Gall and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('126','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Pukropski, Jan;  Keßler, Lisa; von Wrede, Randi;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_127" class="tp_show" onclick="teachpress_pub_showhide('127','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_127" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{Pukropski2026a,<br />
title = {Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie},<br />
author = {Jan Pukropski and Lisa Keßler and Randi von Wrede and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('127','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Steininger, Melissa;  Jansen, Anna;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_128" class="tp_show" onclick="teachpress_pub_showhide('128','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_128" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{steininger2026c,<br />
title = {Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie},<br />
author = {Melissa Steininger and Anna Jansen and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('128','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Mustafa, Sarah Al-Haj;  Jansen, Anna;  Steininger, Melissa;  Müllers, Johannes;  Surges, Rainer;  Helmstaedter, Christoph;  Krüger, Björn; von Wrede, Randi</p><p class="tp_pub_title">Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_129" class="tp_show" onclick="teachpress_pub_showhide('129','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_129" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{mustafa2026a,<br />
title = {Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik},<br />
author = {Sarah Al-Haj Mustafa and Anna Jansen and Melissa Steininger and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Björn Krüger and Randi von Wrede},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('129','tp_bibtex')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New Publication: Activity-Based Emotion Detection Using Wearable Sensors and Transfer Learning</title>
		<link>https://digital-health-bonn.de/new-publication-activity-based-emotion-detection-using-wearable-sensors-and-transfer-learning/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Wed, 25 Feb 2026 15:40:24 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[activity recognition]]></category>
		<category><![CDATA[affective computing]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[emotion detection]]></category>
		<category><![CDATA[emotion recognition from movement]]></category>
		<category><![CDATA[human activity sensing]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[transfer learning]]></category>
		<category><![CDATA[wearable IoT]]></category>
		<category><![CDATA[wearable machine learning]]></category>
		<category><![CDATA[wearable sensors]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=773</guid>

					<description><![CDATA[Understanding human emotions from everyday behavior is an important goal in digital health, affective computing, and wearable sensing. In our latest publication in the IEEE Internet of Things Journal, we investigate how movement data from wearable sensors can be used [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Understanding human emotions from everyday behavior is an important goal in <strong>digital health, affective computing, and wearable sensing</strong>. In our latest publication in the <strong>IEEE Internet of Things Journal</strong>, we investigate how <strong>movement data from wearable sensors</strong> can be used to infer emotional states.</p>



<p>The paper <strong>“From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems”</strong> explores how <strong>machine learning models trained for activity recognition can be adapted to detect emotional states</strong>. Using a <strong>cross-domain transfer learning approach</strong>, the study demonstrates how knowledge learned from physical activity data can be transferred to emotion recognition tasks.</p>
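<p>The pretrain-then-fine-tune pattern described above can be sketched in a few lines. The following is a minimal illustration only, not the actual Jazbat-Net implementation: the synthetic data, network shape, and layer sizes are assumptions chosen to make the idea concrete. A shared latent "movement" signal drives both an activity task (used for pretraining) and an emotion task; the feature extractor learned on activities is then frozen and only a fresh classification head is fitted on the emotion labels.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared latent "movement" factors drive both tasks. All dimensions,
# layer sizes, and datasets here are illustrative assumptions.
D_IN, D_LATENT, D_HID = 6, 3, 16          # 6-axis IMU-style input
N_ACT_CLASSES, N_EMO_CLASSES = 4, 3
MIX = rng.normal(size=(D_LATENT, D_IN))
W_ACT = rng.normal(size=(D_LATENT, N_ACT_CLASSES))
W_EMO = rng.normal(size=(D_LATENT, N_EMO_CLASSES))

def sample(n, label_map):
    """Synthetic stand-in for a wearable inertial-sensor dataset."""
    Z = rng.normal(size=(n, D_LATENT))
    X = Z @ MIX + 0.05 * rng.normal(size=(n, D_IN))
    return X, (Z @ label_map).argmax(axis=1)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, W1, b1, W2, b2, lr=0.05, epochs=300, freeze_features=False):
    """One hidden ReLU layer plus softmax head, full-batch gradient descent."""
    for _ in range(epochs):
        H = np.maximum(X @ W1 + b1, 0.0)
        G = softmax(H @ W2 + b2)              # grad wrt logits = probs - onehot
        G[np.arange(len(y)), y] -= 1.0
        G /= len(y)
        if not freeze_features:               # backprop into the extractor
            GH = (G @ W2.T) * (H > 0)
            W1 -= lr * X.T @ GH
            b1 -= lr * GH.sum(axis=0)
        W2 -= lr * H.T @ G
        b2 -= lr * G.sum(axis=0)
    return W1, b1, W2, b2

# 1) Pretrain the feature extractor on a large activity-recognition set.
Xa, ya = sample(2000, W_ACT)
W1 = 0.5 * rng.normal(size=(D_IN, D_HID)); b1 = np.zeros(D_HID)
W2 = 0.5 * rng.normal(size=(D_HID, N_ACT_CLASSES)); b2 = np.zeros(N_ACT_CLASSES)
W1, b1, W2, b2 = train(Xa, ya, W1, b1, W2, b2)

# 2) Transfer: freeze the extractor, fit a fresh emotion head on little data.
Xe, ye = sample(300, W_EMO)
W2e = np.zeros((D_HID, N_EMO_CLASSES)); b2e = np.zeros(N_EMO_CLASSES)
_, _, W2e, b2e = train(Xe, ye, W1, b1, W2e, b2e, freeze_features=True)

# 3) Evaluate the transferred model on held-out emotion data.
Xt, yt = sample(500, W_EMO)
pred = (np.maximum(Xt @ W1 + b1, 0.0) @ W2e + b2e).argmax(axis=1)
acc = (pred == yt).mean()   # should land well above the 1/3 chance level
```

<p>The design choice this illustrates is the one the paper exploits: because activity and emotion expression share underlying movement structure, a representation pretrained on abundant activity labels lets the scarce emotion labels be spent only on a small head, rather than on learning features from scratch.</p>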



<p>Wearable devices continuously capture <strong>movement patterns, step dynamics, and physical activity signals</strong>, which provide valuable information about human behavior in everyday environments. By leveraging these signals, the proposed approach opens new opportunities for <strong>non-invasive emotion monitoring using wearable IoT systems</strong>.</p>



<p>Such technologies have potential applications in <strong>mental health monitoring, digital phenotyping, and personalized healthcare</strong>, where unobtrusive sensing and intelligent data analysis can support early detection of behavioral and emotional changes.</p>



<p>This publication also continues a <strong>long-standing research collaboration between Qaiser Riaz and Björn Krüger</strong>, focusing on <strong>machine learning methods for human movement analysis and wearable sensor systems</strong>.</p>



<p>The full publication can be accessed via IEEE Xplore:<br><a href="https://ieeexplore.ieee.org/document/11404152">https://ieeexplore.ieee.org/document/11404152</a></p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_article"><div class="tp_pub_info"><p class="tp_pub_author"> Imran, Hamza Ali;  Riaz, Qaiser;  Hamza, Kiran;  Muhammad, Shaida;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('122','tp_links')" style="cursor:pointer;">From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems</a> <span class="tp_pub_type tp_  article">Journal Article</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_in">In: </span><span class="tp_pub_additional_journal">IEEE Internet of Things Journal, </span><span class="tp_pub_additional_pages">pp. 1-1, </span><span class="tp_pub_additional_year">2026</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_122" style="display:none;"><div class="tp_bibtex_entry"><pre>@article{Imran2026a,<br />
title = {From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems},<br />
author = {Hamza Ali Imran and Qaiser Riaz and Kiran Hamza and Shaida Muhammad and Björn Krüger},<br />
doi = {10.1109/JIOT.2026.3666469},<br />
year  = {2026},<br />
date = {2026-02-20},<br />
urldate = {2026-02-20},<br />
journal = {IEEE Internet of Things Journal},<br />
pages = {1-1},<br />
abstract = {Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {article}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_122" style="display:none;"><div class="tp_abstract_entry">Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. 
On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_122" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1109/JIOT.2026.3666469" title="Follow DOI:10.1109/JIOT.2026.3666469" target="_blank">doi:10.1109/JIOT.2026.3666469</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_links')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Abstracts Accepted for International AI in Epilepsy Conference</title>
		<link>https://digital-health-bonn.de/abstracts-accepted-for-international-ai-in-epilepsy-conference/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 22 Dec 2025 16:14:09 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=711</guid>

					<description><![CDATA[We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the 4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders, which will take place in [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the <strong>4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders</strong>, which will take place in Puerto Rico from <strong>March 16–19, 2026</strong>.</p>



<p>The accepted contributions are:</p>



<ul class="wp-block-list">
<li><em>Linking Higher-Level Eye Tracking Metrics to High-Impact Antiseizure Medication in Epilepsy Patients</em></li>



<li><em>Higher-Level Eye Tracking Metrics Reveal Search Behaviour Differences in Persons with Epilepsy vs. Healthy Controls</em></li>
</ul>



<p>Anna Jansen and Melissa Steininger will present this work at the conference.</p>



<p>We look forward to continuing our research on digital health and AI in epilepsy in the coming year and wish everyone a restful holiday season.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The VIRTOSHA Project at MEDICA 2025</title>
		<link>https://digital-health-bonn.de/the-virtosha-project-at-medica-2025/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Wed, 19 Nov 2025 07:15:02 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[medica]]></category>
		<category><![CDATA[virtosha]]></category>
		<category><![CDATA[virtual reality]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=669</guid>

					<description><![CDATA[Yesterday, Anna Jansen and Kristian Welle presented our VIRTOSHA project in the Innovation#Area at the NRW state booth at MEDICA. In a short talk, Kristian outlined the project’s goals and technical approach. A particular highlight was the drilling simulation demo [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Yesterday, Anna Jansen and Kristian Welle presented our <a href="https://digital-health-bonn.de/virtosha/" data-type="page" data-id="109">VIRTOSHA</a> project in the Innovation#Area at the NRW state booth at MEDICA.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="2190" height="1643" data-id="673" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2.webp" alt="" class="wp-image-673" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2.webp 2190w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2-300x225.webp 300w" sizes="(max-width: 2190px) 100vw, 2190px" /></figure>



<figure class="wp-block-image size-large"><img decoding="async" width="768" height="1024" data-id="670" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-768x1024.webp" alt="" class="wp-image-670" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-768x1024.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-225x300.webp 225w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1.webp 1232w" sizes="(max-width: 768px) 100vw, 768px" /></figure>



<figure class="wp-block-image size-large"><img decoding="async" width="768" height="1024" data-id="671" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-768x1024.webp" alt="" class="wp-image-671" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-768x1024.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-225x300.webp 225w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-1152x1536.webp 1152w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3.webp 1349w" sizes="(max-width: 768px) 100vw, 768px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="672" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1024x768.webp" alt="" class="wp-image-672" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1024x768.webp 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-300x225.webp 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-768x576.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-360x270.webp 360w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<p>In a short talk, Kristian outlined the project’s goals and technical approach. A particular highlight was the drilling simulation demo with a haptic arm, which provided highly realistic tactile feedback and sparked strong interest among visitors.</p>



<p>VIRTOSHA fits seamlessly into the HealthTech.NRW focus area of high-tech surgery – an environment where robots, lasers, and VR are already part of everyday clinical practice in NRW, significantly enhancing patient safety and treatment quality.</p>



<p>Many thanks to all visitors for the inspiring conversations and positive feedback!</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Panel “AI for Health and Health for AI”</title>
		<link>https://digital-health-bonn.de/panel-ai-for-health-and-health-for-ai/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Tue, 28 Oct 2025 10:52:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[conference]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=676</guid>

					<description><![CDATA[Prof. Björn Krüger was invited to participate in the panel “AI for Health and Health for AI” at the University of Saskatchewan. In his talk, “AI for Health: From Targeted Algorithms to Real-Life Impact,” he presented current research from the [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Prof. Björn Krüger was invited to participate in the panel <em>“AI for Health and Health for AI”</em> at the University of Saskatchewan. In his talk, <strong>“AI for Health: From Targeted Algorithms to Real-Life Impact,”</strong> he presented current research from the Department of Epileptology at the University Hospital Bonn, where his group develops AI models to understand human motion and cognition in real-world healthcare contexts.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="678" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1024x768.jpeg" alt="" class="wp-image-678" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1024x768.jpeg 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-300x225.jpeg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-768x576.jpeg 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1536x1152.jpeg 1536w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-360x270.jpeg 360w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793.jpeg 1696w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="677" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-1024x768.jpeg" alt="" class="wp-image-677" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-1024x768.jpeg 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-300x225.jpeg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-768x576.jpeg 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-360x270.jpeg 360w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939.jpeg 2000w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<p>His contribution highlighted ongoing projects ranging from wearable-based motion analysis—using lightweight models for activity and emotion recognition—to eye-tracking studies in epilepsy that investigate how cognitive changes and medication side effects manifest in eye movements.</p>



<p>The event offered a valuable opportunity for exchange with fellow panelists <strong>Stephen Lee</strong> (College of Medicine, University of Saskatchewan) and <strong>Raymond Ng</strong> (Data Science Institute / AI &amp; Health Network, University of British Columbia), as well as <strong>Daniel Fuller</strong> (University of Saskatchewan), who did an excellent job moderating and organizing the panel.</p>



<p>The discussion underscored a central theme of his work: <strong>AI in healthcare is not only about accuracy—its true impact lies in improving quality of life, supporting clinicians, and empowering patients.</strong></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>🎉 Exciting News to Close the Year! 🎉</title>
		<link>https://digital-health-bonn.de/%f0%9f%8e%89-exciting-news-to-close-the-year-%f0%9f%8e%89/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Thu, 19 Dec 2024 13:22:02 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=526</guid>

					<description><![CDATA[As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025! Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:✅ Digital Transformation of Blood [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025!</p>



<p>Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Digital Transformation of Blood Sample Management<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> EpiEye – Effects of Seizure-Suppressive Medications on Eye Movements and Autonomic Changes in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Establishing an Epileptological Teleconsultation Between Siegen Hospital and University Hospital Bonn<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> ANNE – Eye-Tracking for Detecting Side Effects in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Semiological Characteristics of Temporal Lobe Epilepsy with GAD65 Antibodies</p>



<p>Accepted Contributions – International Conference on Artificial Intelligence in Epilepsy and Other Neurological Disorders 2025:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Eye-Tracking Reveals Search Behaviour in Epilepsy Patients<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Prediction Models on Eye-Tracking Data in Epilepsy</p>



<p>Accepted Contribution – Wissenschaftliche Tagung Autismus-Spektrum (WTAS):<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Multi-Stream Analysis for Robust Head Gesture Classification in Natural Social Interaction: Leveraging High-Resolution Sensor Data for Optimized Visual Classification</p>



<p>We sincerely thank all contributors for their dedication and hard work. We look forward to engaging discussions, inspiring exchanges, and exciting collaborations at these upcoming conferences!</p>



<p>Wishing everyone a successful and inspiring year ahead!</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Kickoff Meeting for VIRTOSHA Project</title>
		<link>https://digital-health-bonn.de/kickoff-meeting-for-virtosha-project/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 08 Jul 2024 21:16:57 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[project]]></category>
		<category><![CDATA[virtual reality]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=362</guid>

					<description><![CDATA[We are excited to announce that our groundbreaking research project, VIRTOSHA (an authoring tool for Virtual Reality training simulations of osteosynthesis with haptic feedback and tissue simulation), has officially started! 🎉 On July 5th, 2024, we held the kickoff meeting, [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are excited to announce that our groundbreaking research project, VIRTOSHA (an authoring tool for Virtual Reality training simulations of osteosynthesis with haptic feedback and tissue simulation), has officially started! <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f389.png" alt="🎉" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>



<p>On July 5th, 2024, we held the kickoff meeting, marking the beginning of a journey that aims to revolutionize surgical training. Collaborating with our partners from <a href="https://cg.web.th-koeln.de/">TH Köln</a>, <a href="https://www.mindport.co/">Mindport GmbH</a>, and <a href="https://www.haption.de/">Haption</a>, our goal is to provide surgeons with the highest quality VR training. This innovative approach seeks to replace traditional anatomical preparations (cadaveric specimens prepared for surgical training) with advanced virtual reality simulations. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2620.png" alt="☠" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>



<p>The kickoff meeting brought together all project partners for the first time, allowing the entire team to meet and get acquainted in person. The synergy and enthusiasm in the room were palpable, reinforcing that we have indeed found the perfect collaborators for this ambitious endeavor. Over the next three years, we are committed to overcoming the academic and technical challenges that lie ahead to create a significant real-world impact.</p>



<p>For more information and to stay updated on our progress, please visit our <a href="https://digital-health-bonn.de/virtosha/" data-type="page" data-id="109">project page</a>.</p>



<figure class="wp-block-gallery has-nested-images columns-3 is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-medium"><img loading="lazy" decoding="async" width="300" height="240" data-id="364" src="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_7678b456_free-300x240.jpg" alt="" class="wp-image-364" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_7678b456_free-300x240.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_7678b456_free-768x615.jpg 768w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_7678b456_free.jpg 953w" sizes="auto, (max-width: 300px) 100vw, 300px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="735" data-id="363" src="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_55cdabf9_free-1024x735.jpg" alt="" class="wp-image-363" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_55cdabf9_free-1024x735.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_55cdabf9_free-300x215.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_55cdabf9_free-768x551.jpg 768w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.31_55cdabf9_free.jpg 1317w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="797" height="1024" data-id="366" src="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe-797x1024.jpg" alt="" class="wp-image-366" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe-797x1024.jpg 797w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe-1195x1536.jpg 1195w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe-233x300.jpg 233w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe-768x987.jpg 768w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.33.15_720ab8fe.jpg 1593w" sizes="auto, (max-width: 797px) 100vw, 797px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="615" data-id="365" src="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free-1024x615.jpg" alt="" class="wp-image-365" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free-1024x615.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free-1536x923.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free-300x180.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free-768x461.jpg 768w, https://digital-health-bonn.de/wp-content/uploads/2024/07/WhatsApp-Bild-2024-07-03-um-16.20.32_0c0730c7_free.jpg 1568w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Pre-Impact Fall Detection: Paper Accepted!</title>
		<link>https://digital-health-bonn.de/pre-impact-fall-detection-paper-accepted/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 10 Jun 2024 15:46:08 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=332</guid>

					<description><![CDATA[We are thrilled to announce that our latest research paper, &#8220;Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes,&#8221; has been accepted for publication in the IEEE Sensors Journal. Our study focuses on developing a deep classifier to [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are thrilled to announce that our latest research paper, &#8220;Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes,&#8221; has been accepted for publication in the IEEE Sensors Journal. Our study presents a deep classifier that detects falls in the pre-impact phase using wearable sensors, achieving an average accuracy and F1 score of 98% with an inference time of 46–52 milliseconds. This research has significant implications for improving the health and safety of the elderly and people with disabilities.</p>



<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><div class="tp_publication tp_publication_article"><div class="tp_pub_info"><p class="tp_pub_author"> Kiran, Samia;  Riaz, Qaiser;  Hussain, Mehdi;  Zeeshan, Muhammad;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('75','tp_links')" style="cursor:pointer;">Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes</a> <span class="tp_pub_type tp_  article">Journal Article</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_in">In: </span><span class="tp_pub_additional_journal">IEEE Sensors Journal, </span><span class="tp_pub_additional_volume">vol. 24, </span><span class="tp_pub_additional_number">no. 15, </span><span class="tp_pub_additional_pages">pp. 24086-24095, </span><span class="tp_pub_additional_year">2024</span>, <span class="tp_pub_additional_issn">ISSN: 1558-1748</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_75" class="tp_show" onclick="teachpress_pub_showhide('75','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_75" style="display:none;"><div class="tp_bibtex_entry"><pre>@article{10552639,<br />
title = {Unveiling Fall Origins: Leveraging Wearable Sensors to Detect Pre-Impact Fall Causes},<br />
author = {Samia Kiran and Qaiser Riaz and Mehdi Hussain and Muhammad Zeeshan and Björn Krüger},<br />
doi = {10.1109/JSEN.2024.3407835},<br />
issn = {1558-1748},<br />
year  = {2024},<br />
date = {2024-08-01},<br />
urldate = {2024-01-01},<br />
journal = {IEEE Sensors Journal},<br />
volume = {24},<br />
number = {15},<br />
pages = {24086-24095},<br />
abstract = {Falling poses a significant challenge to the health and well-being of the elderly and people with various disabilities. Precise and prompt fall detection plays a crucial role in preventing falls and mitigating the impact of injuries. In this research, we propose a deep classifier for pre-impact fall detection which can detect a fall in the pre-impact phase with an inference time of 46–52 milliseconds. The proposed classifier is an ensemble of Convolutional Neural Networks (CNNs) and Bidirectional Gated Recurrent Units (BiGRU) with residual connections. We validated the performance of the proposed classifier on a comprehensive, publicly available preimpact fall dataset. The dataset covers 36 diverse activities, including 15 types of fall-related activities and 21 types of activities of daily living (ADLs). Furthermore, we evaluated the proposed model using three different inputs of varying dimensions: 6D input (comprising 3D accelerations and 3D angular velocities), 3D input (3D accelerations), and 1D input (magnitude of 3D accelerations). The reduction in the input space from 6D to 1D is aimed at minimizing the computation cost. We have attained commendable results outperforming the state-of-the-art approaches by achieving an average accuracy and F1 score of 98% for 6D input size. The potential implications of this research are particularly relevant in the realm of smart healthcare, with a focus on the elderly and differently-abled population.},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {article}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_75" style="display:none;"><div class="tp_abstract_entry">Falling poses a significant challenge to the health and well-being of the elderly and people with various disabilities. Precise and prompt fall detection plays a crucial role in preventing falls and mitigating the impact of injuries. In this research, we propose a deep classifier for pre-impact fall detection which can detect a fall in the pre-impact phase with an inference time of 46–52 milliseconds. The proposed classifier is an ensemble of Convolutional Neural Networks (CNNs) and Bidirectional Gated Recurrent Units (BiGRU) with residual connections. We validated the performance of the proposed classifier on a comprehensive, publicly available preimpact fall dataset. The dataset covers 36 diverse activities, including 15 types of fall-related activities and 21 types of activities of daily living (ADLs). Furthermore, we evaluated the proposed model using three different inputs of varying dimensions: 6D input (comprising 3D accelerations and 3D angular velocities), 3D input (3D accelerations), and 1D input (magnitude of 3D accelerations). The reduction in the input space from 6D to 1D is aimed at minimizing the computation cost. We have attained commendable results outperforming the state-of-the-art approaches by achieving an average accuracy and F1 score of 98% for 6D input size. 
The potential implications of this research are particularly relevant in the realm of smart healthcare, with a focus on the elderly and differently-abled population.</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_75" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1109/JSEN.2024.3407835" title="Follow DOI:10.1109/JSEN.2024.3407835" target="_blank">doi:10.1109/JSEN.2024.3407835</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('75','tp_links')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Interview with Chefärztebrief</title>
		<link>https://digital-health-bonn.de/interview-with-chefarztebrief/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Fri, 18 Aug 2023 12:05:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[interview]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=255</guid>

					<description><![CDATA[In an interview with the German portal Chefärztebrief, Prof. Dr. Björn Krüger, head of the &#8220;Personalized Digital Health and Telemedicine&#8221; research group at the Department of Epileptology at the University Hospital Bonn (UKB), shared groundbreaking insights into the future of [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>In an interview with the German portal Chefärztebrief, Prof. Dr. Björn Krüger, head of the &#8220;Personalized Digital Health and Telemedicine&#8221; research group at the Department of Epileptology at the University Hospital Bonn (UKB), shared groundbreaking insights into the future of patient care through digital technology. The interview, titled &#8220;Digitally recorded movement patterns of patients can support many disciplines!&#8221;, delves into Prof. Krüger&#8217;s research on the digital capture of human data, including movement and vital parameters.</p>



<p>Prof. Krüger, whose work stands at the intersection of healthcare and technology, discussed how digital tools and methodologies are revolutionizing the way patient data is collected and analyzed. This includes everything from the way we understand movement patterns to how we monitor vital signs, offering a more comprehensive view of patient health than ever before.</p>



<p>You can find the whole interview <a href="https://www.iww.de/cb/management/grundlagenforschung-digital-aufgezeichnete-bewegungsmuster-der-patienten-koennen-viele-faecher-unterstuetzen-f155137">here</a> (paywall).</p>



]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
