From be629c16ab0c1008d0bc419c62e55097b9719c51 Mon Sep 17 00:00:00 2001
From: Kavish Devar
Date: Fri, 14 Mar 2025 17:40:34 +0530
Subject: [PATCH] Update README.md

---
 head-tracking/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/head-tracking/README.md b/head-tracking/README.md
index 8a3e2f5..e162b9d 100644
--- a/head-tracking/README.md
+++ b/head-tracking/README.md
@@ -1,6 +1,6 @@
-# AirPods Head Tracking Project
+# AirPods Head Tracking Visualizer
 
-This project implements head tracking with AirPods by gathering sensor data over Bluetooth, processing orientation and acceleration values, and detecting head gestures. The codebase is split into the following components:
+This part of the project implements head tracking with AirPods by gathering sensor data over l2cap, processing orientation and acceleration values, and detecting head gestures. The codebase is split into the following components:
 
 - **Connection and Data Collection**
   The project uses a custom ConnectionManager (imported in multiple files) to connect via Bluetooth to AirPods. Once connected, sensor packets are received in raw hex format. An AirPodsTracker class (in `plot.py`) handles the start/stop of tracking, logging of raw data, and parsing of packets into useful fields.
@@ -43,4 +43,4 @@ This project implements head tracking with AirPods by gathering sensor data over
 - **Alternation Factor:** Verifies that the signal alternates (for instance, switching between positive and negative values).
 - **Isolation Factor:** Checks that movement on the target axis (vertical for nodding, horizontal for shaking) dominates over the non-target axis.
 
-  A weighted sum of these factors forms a confidence score which, if above a predefined threshold (e.g. 0.7), confirms a detected gesture.
\ No newline at end of file
+  A weighted sum of these factors forms a confidence score which, if above a predefined threshold (e.g. 0.7), confirms a detected gesture.
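
The README text shown in the patch context describes combining an alternation factor and an isolation factor into a weighted confidence score compared against a 0.7 threshold. The following Python sketch illustrates that scoring scheme; the function names, weights, and sample data are illustrative assumptions, not code from the actual head-tracking repository:

```python
# Hypothetical sketch of the gesture-confidence scoring described in the
# patched README. The factor names come from the README; the weights and
# helper signatures are invented for illustration.

def alternation_factor(samples):
    """Fraction of consecutive samples whose sign flips (0.0 to 1.0)."""
    if len(samples) < 2:
        return 0.0
    flips = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return flips / (len(samples) - 1)

def isolation_factor(target_axis, other_axis):
    """How strongly motion on the target axis dominates the other axis."""
    target_energy = sum(abs(v) for v in target_axis)
    other_energy = sum(abs(v) for v in other_axis)
    total = target_energy + other_energy
    return target_energy / total if total > 0 else 0.0

def gesture_confidence(target_axis, other_axis,
                       w_alt=0.5, w_iso=0.5, threshold=0.7):
    """Weighted sum of the factors; gesture is confirmed above threshold."""
    score = (w_alt * alternation_factor(target_axis)
             + w_iso * isolation_factor(target_axis, other_axis))
    return score, score >= threshold

# Example: a nodding-like vertical signal with little horizontal motion.
vertical = [0.8, -0.7, 0.9, -0.8, 0.7, -0.6]
horizontal = [0.05, -0.02, 0.04, 0.01, -0.03, 0.02]
score, detected = gesture_confidence(vertical, horizontal)
```

With the sample data above, the vertical signal alternates on every step and carries almost all of the energy, so the weighted score clears the 0.7 threshold and the gesture is reported as detected.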