SensorHub provides access to raw sensor data, supports running projects, and enables the development of machine learning models from the extracted data. The app has a modular architecture, so it can be extended with support for new devices as new projects require them. A backend running on a server provides long-term, centralized storage of configuration and sensor data.
In this bachelor project, new functionality for the SensorHub system is implemented and validated to support so-called biofeedback. Roughly speaking, biofeedback is a reaction (e.g., haptic, acoustic, or visual) triggered by previously defined vital parameters. For example, a smartwatch could remind the wearer via vibration that it is time for a break after hours of programming, because physiological parameters (e.g., EEG measurements) indicate that concentration has dropped sharply. Alternatively, certain physiological measurements and derived metrics could be visualized continuously to enable better self-assessment and regulation of psychological and physical responses.
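To make the idea of a biofeedback rule more concrete, the following is a minimal conceptual sketch of a threshold-based break reminder in Kotlin. All names (`ConcentrationSample`, `FeedbackSink`, `BreakReminderRule`) are hypothetical illustrations and do not correspond to the actual SensorHub API; on a real smartwatch the feedback sink would map to the platform's haptic interface.

```kotlin
// Hypothetical sketch of a threshold-based biofeedback rule.
// None of these types are part of SensorHub; they only illustrate the concept.

data class ConcentrationSample(val timestampMs: Long, val score: Double)

// Abstraction over the feedback channel (haptic, acoustic, or visual).
fun interface FeedbackSink {
    fun notifyBreak(message: String)
}

class BreakReminderRule(
    private val threshold: Double,   // concentration score below which feedback fires
    private val cooldownMs: Long,    // minimum time between two reminders
    private val sink: FeedbackSink
) {
    private var lastTriggeredMs: Long? = null

    // Called for every incoming (e.g., EEG-derived) concentration sample.
    fun onSample(sample: ConcentrationSample) {
        val last = lastTriggeredMs
        val cooledDown = last == null || sample.timestampMs - last >= cooldownMs
        if (sample.score < threshold && cooledDown) {
            lastTriggeredMs = sample.timestampMs
            sink.notifyBreak("Concentration low: time for a short break.")
        }
    }
}

fun main() {
    // Example: print instead of vibrating; a smartwatch would use its haptic API here.
    val rule = BreakReminderRule(
        threshold = 0.4,
        cooldownMs = 30L * 60 * 1000,
        sink = FeedbackSink { msg -> println(msg) }
    )
    rule.onSample(ConcentrationSample(timestampMs = 0, score = 0.8))       // above threshold: no feedback
    rule.onSample(ConcentrationSample(timestampMs = 60_000, score = 0.35)) // below threshold: feedback fires
}
```

The cooldown illustrates one design consideration of such rules: feedback should be rate-limited so that it prompts a break rather than becoming a distraction itself.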