In progress · Next.js · TypeScript · PWA · Sensor · Signal processing · Bluetooth

CPR Trainer

Thesis prototype: a training PWA that turns smartphone/wearable accelerometer data into real-time CPR (cardiopulmonary resuscitation) compression feedback and session review.

CPR Trainer mockup

Introduction

A CPR practice companion that uses accelerometer data (phone or wearable) to provide real-time feedback on chest compressions, without requiring dedicated training hardware.

Background

This project builds directly on the ECG prototype, which demonstrated that a browser-based stack can ingest live sensor streams and render them in real time.

That finding made this project feasible. Most devices already contain an accelerometer and enough compute to collect signals, process them, and provide feedback. The benefit is tangible: CPR quality directly affects outcomes, especially in the first minutes before professional help arrives. Lowering the barrier to practice—with feedback on technique—can make training more accessible outside specialized facilities.

The thesis currently focuses on the algorithmic core: transforming raw sensor bytes into calibrated acceleration and refining that stream into meaningful compression metrics. A key research task is establishing a stable relationship between acceleration signals and a useful depth/quality proxy under real-world conditions. The next stage is packaging these methods into a robust, cross-device implementation.
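To make the "raw bytes to calibrated acceleration" step concrete, here is a minimal sketch of per-axis calibration as it might look in TypeScript; the profile shape, field names, and units are illustrative assumptions rather than the thesis implementation:

```ts
// Illustrative only: a per-axis offset/scale calibration profile.
// The actual profile format used in the thesis may differ.
interface CalibrationProfile {
  offset: { x: number; y: number; z: number }; // zero-g bias, in raw sensor units
  scale: { x: number; y: number; z: number };  // raw sensor units per m/s²
}

interface AccelSample {
  t: number;                        // timestamp in ms
  x: number; y: number; z: number; // calibrated acceleration in m/s²
}

function calibrate(
  raw: { t: number; x: number; y: number; z: number },
  cal: CalibrationProfile,
): AccelSample {
  return {
    t: raw.t,
    x: (raw.x - cal.offset.x) / cal.scale.x,
    y: (raw.y - cal.offset.y) / cal.scale.y,
    z: (raw.z - cal.offset.z) / cal.scale.z,
  };
}
```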

Problem

Many people who want to practice CPR have access only to a smartphone, with no instructor-led training or feedback manikins. Without guided feedback and repetition, it is difficult to maintain correct rate, rhythm, and depth consistency, especially while staying calm and executing the full procedure.

From an engineering perspective, two challenges stand out:

  • Signal interpretation: wrist/phone accelerometer data is noisy and orientation-dependent. Turning acceleration into a meaningful proxy for compression quality requires careful processing and calibration. Naive methods (such as double integration) drift quickly and do not generalize across devices, placements, or technique; an orientation-robust alternative is sketched after this list.

  • Cross-device reliability: unlike microphones or headphones, accelerometer access and behavior vary widely between devices and wearables. For feedback to be trustworthy, the system must reliably acquire motion data at sufficient quality and normalize device-specific formats into a consistent stream the algorithm can use.
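To illustrate what an orientation-robust alternative to double integration can look like, here is a minimal sketch that works on the acceleration magnitude and removes the gravity component with a slow-moving average; the smoothing constant is an assumed value, not a tuned one:

```ts
// Illustrative sketch: orientation-independent motion signal.
// Using the vector magnitude avoids depending on a single axis,
// and subtracting a slow-moving average removes the gravity offset.
// The smoothing factor (ALPHA) is an assumed value, not a calibrated one.

let gravityEstimate = 9.81; // running estimate of the static component (m/s²)
const ALPHA = 0.02;         // smoothing factor for the slow average

function motionSignal(x: number, y: number, z: number): number {
  const magnitude = Math.hypot(x, y, z);                     // rotation-invariant
  gravityEstimate += ALPHA * (magnitude - gravityEstimate);  // slow gravity tracker
  return magnitude - gravityEstimate;                        // high-passed motion component
}
```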

Solution

A prototype measurement + feedback loop built around a controlled sensor pipeline and a portable delivery model:

  • Reliable data source: a custom wearable streams 3‑axis acceleration in a format and sampling profile the pipeline can trust, allowing the signal‑processing work to be validated without device-specific noise dominating early development.

  • Processing into metrics: the app decodes and normalizes incoming samples and applies a signal‑processing pipeline to extract compression cycles and training‑relevant metrics (sketched after this list).

  • Real-time feedback: the UI reports rate/rhythm and provides a depth‑quality classification (shallow / correct / deep) designed for immediate self‑correction during practice.

  • Session capture for iteration: sessions can be recorded and replayed to support offline analysis, threshold tuning, and calibration against a CPR simulator.
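A minimal sketch of how compression cycles and the shallow / correct / deep classification could be derived from a filtered motion signal; the peak-detection logic and all thresholds here are illustrative placeholders, not the calibrated values used in the project:

```ts
// Illustrative sketch: detect compression cycles from a filtered motion
// signal and classify each cycle. Thresholds are placeholders; in the
// project they would come from calibration against a CPR simulator.

type Depth = "shallow" | "correct" | "deep";

interface Compression {
  timeMs: number;     // time of the detected peak
  peakValue: number;  // peak of the motion signal, used as a depth proxy
  depth: Depth;
}

const MIN_PEAK = 2.0;      // assumed detection threshold (m/s²)
const SHALLOW_BELOW = 4.0; // assumed boundary: below this => shallow
const DEEP_ABOVE = 9.0;    // assumed boundary: above this => deep
const MIN_CYCLE_MS = 250;  // refractory period to avoid double counting

let lastPeakMs = -Infinity;

function detectCompression(
  prev: number,
  curr: number,
  next: number,
  timeMs: number,
): Compression | null {
  // Local-maximum check plus a refractory period.
  const isPeak = curr > prev && curr >= next && curr > MIN_PEAK;
  if (!isPeak || timeMs - lastPeakMs < MIN_CYCLE_MS) return null;
  lastPeakMs = timeMs;
  const depth: Depth =
    curr < SHALLOW_BELOW ? "shallow" : curr > DEEP_ABOVE ? "deep" : "correct";
  return { timeMs, peakValue: curr, depth };
}

// Rate in compressions per minute from the most recent peak times.
function compressionRate(peakTimesMs: number[]): number {
  if (peakTimesMs.length < 2) return 0;
  const spanMs = peakTimesMs[peakTimesMs.length - 1] - peakTimesMs[0];
  return ((peakTimesMs.length - 1) / spanMs) * 60_000;
}
```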

Next stages expand input options (phone sensors and commercial wearables) while keeping the same evaluation logic and training workflow.
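For the planned phone-sensor input, a hedged sketch of how the standard devicemotion event could feed the same normalized stream; onSample is a hypothetical entry point into the existing pipeline, not an actual function in the codebase:

```ts
// Illustrative sketch: feeding phone accelerometer data into the same
// pipeline via the standard devicemotion event. `onSample` is a
// hypothetical hook into the existing processing loop.

declare function onSample(sample: { t: number; x: number; y: number; z: number }): void;

async function startPhoneSensor(): Promise<void> {
  // iOS Safari gates motion sensors behind an explicit permission prompt.
  const motionApi = DeviceMotionEvent as unknown as {
    requestPermission?: () => Promise<string>;
  };
  if (typeof motionApi.requestPermission === "function") {
    const state = await motionApi.requestPermission();
    if (state !== "granted") throw new Error("Motion permission denied");
  }

  window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
    const a = event.accelerationIncludingGravity;
    if (!a || a.x == null || a.y == null || a.z == null) return;
    onSample({ t: performance.now(), x: a.x, y: a.y, z: a.z });
  });
}
```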

How it works (current prototype)

  • Connect a device: pair the custom wearable via Bluetooth (BLE) from the PWA (prototype stage); a connection sketch follows this list.
  • Start a session: select a calibration profile, then begin compressions. The live view shows rate and rhythm updates as you practice.
  • Get live feedback: the app flags rate/rhythm drift and classifies compressions (shallow / correct / deep) to support immediate correction.
  • Record and replay: capture a segment and replay it to review consistency and drift over time.
  • Analyze recordings: recordings feed offline tuning and calibration against a CPR simulator.
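A minimal sketch of the "connect a device" step using the Web Bluetooth API (types as provided by @types/web-bluetooth); the service and characteristic UUIDs are placeholders for whatever the custom wearable actually advertises, and handleNotification is a hypothetical handler:

```ts
// Illustrative sketch: pairing the wearable and subscribing to
// accelerometer notifications via Web Bluetooth. UUIDs are placeholders.

const ACCEL_SERVICE_UUID = "0000aaaa-0000-1000-8000-00805f9b34fb";        // placeholder
const ACCEL_CHARACTERISTIC_UUID = "0000bbbb-0000-1000-8000-00805f9b34fb"; // placeholder

declare function handleNotification(value: DataView): void; // hypothetical handler

async function connectWearable(): Promise<void> {
  // requestDevice must be triggered by a user gesture (e.g. a "Connect" button).
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: [ACCEL_SERVICE_UUID] }],
  });
  const server = await device.gatt!.connect();
  const service = await server.getPrimaryService(ACCEL_SERVICE_UUID);
  const characteristic = await service.getCharacteristic(ACCEL_CHARACTERISTIC_UUID);

  characteristic.addEventListener("characteristicvaluechanged", (event) => {
    const target = event.target as BluetoothRemoteGATTCharacteristic;
    handleNotification(target.value!);
  });
  await characteristic.startNotifications();
}
```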

Architecture (current prototype)

  • Wearable firmware: streams 3‑axis accelerometer samples over BLE notifications.
  • PWA ingestion: Web Bluetooth connection subscribes to the stream and normalizes samples into a consistent internal format (decoding sketched after this list).
  • Processing loop: rolling buffer + real‑time signal processing to detect cycles and compute metrics.
  • Offline analysis tooling: external scripts process larger recorded datasets to tune thresholds and validate improvements before they are brought back into the real‑time pipeline.
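To illustrate the ingestion and processing-loop steps above, a sketch of decoding one BLE notification into normalized samples and appending them to a rolling buffer; the packet layout (little-endian int16 per axis), scale factor, and buffer size are assumptions, not the wearable's actual format:

```ts
// Illustrative sketch: decode a BLE notification into normalized samples
// and keep a fixed-length rolling buffer for the real-time processing loop.
// Packet layout (three little-endian int16 values per sample) and the
// scale factor are assumptions, not the wearable's actual format.

interface AccelSample { t: number; x: number; y: number; z: number } // m/s²

const RAW_TO_MS2 = 9.81 / 4096; // assumed raw-unit to m/s² conversion
const BUFFER_SIZE = 512;        // a few seconds of samples at ~100 Hz
const buffer: AccelSample[] = [];

function decodeNotification(value: DataView, receivedAtMs: number): void {
  const BYTES_PER_SAMPLE = 6; // int16 x, y, z
  for (let offset = 0; offset + BYTES_PER_SAMPLE <= value.byteLength; offset += BYTES_PER_SAMPLE) {
    const sample: AccelSample = {
      t: receivedAtMs,
      x: value.getInt16(offset, true) * RAW_TO_MS2,
      y: value.getInt16(offset + 2, true) * RAW_TO_MS2,
      z: value.getInt16(offset + 4, true) * RAW_TO_MS2,
    };
    buffer.push(sample);
    if (buffer.length > BUFFER_SIZE) buffer.shift(); // rolling window
  }
}
```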

Note: this is a prototype in active development; the workflow stays stable, while internal implementation details may evolve as device support expands.

Key engineering choices

  • Accessible by design: the goal is feedback from sensors people already have (phone or wearable), without relying on specialized hardware.
  • Algorithm first: the work prioritizes the signal processing and calibration against a CPR simulator before expanding device support.
  • Robust feedback over perfect modeling: the focus is consistent, repeatable guidance across different placements and motion artifacts, rather than a “perfect physics” reconstruction.
  • Works under rotation: detection and classification are built to hold up even when the phone/wrist orientation changes, instead of depending on a single axis.

Results (prototype stage)

  • Working core from sensor to web application over Web Bluetooth.
  • Reliable compression detection from accelerometer signals.
  • Real‑time compression rate and depth classification.
  • Session capture plus offline analysis workflow to validate thresholds and support calibration.