Commit edea11f — Add joint angle regression LSL-first example and Open Ephys streamer

# Joint Angle Regression from EMG

**Comprehensive GUI for collecting synchronized EMG and joint angle data, training regression models, and live comparison.**

Adapted from the MindRove-EMG `new_session_gui.py` to work with Open Ephys ZMQ streaming.

## Features

- 📊 **Visual Flow Diagram**: See your pipeline at a glance
- 🎯 **Guided Prompts**: Follow structured movement protocols
- 📝 **In-GUI Recording**: Synchronized EMG + angle capture with windowing
- 🧪 **EMG Filtering**: Optional highpass, notch, and lowpass filters
- 📡 **IMU Integration**: Optional Sleeve IMU for orientation tracking (RPY)
- 🤖 **Model Training**: Launch training scripts directly from the GUI
- 🔴 **Live Comparison**: Real-time prediction vs. ground truth
- 📈 **Comprehensive Logging**: Track all operations
## Overview

This example demonstrates how to:
1. **Stream EMG data** from the Open Ephys GUI via ZMQ
2. **Receive joint angles** from a hand tracking system via LSL
3. **Record synchronized data** for regression training
4. **Train regression models** to predict joint angles from EMG
## Prerequisites

### Hardware
- Open Ephys acquisition system with EMG amplifier
- Camera system or hand tracking device (outputting to LSL)

### Software
```bash
# Install python-open-ephys
pip install --index-url https://test.pypi.org/simple/ --no-deps python-oephys

# Install required packages
pip install numpy PyQt5 pylsl
```

### System Setup
1. **Open Ephys GUI**: Launch with the ZMQ Interface plugin enabled
   - Configure your EMG channels (e.g., 8 channels at 5000 Hz)
   - Note the ZMQ port (default: 5556)

2. **Hand Tracking**: Any system that broadcasts joint angles via LSL
   - Examples: MediaPipe hand tracking, finger goniometers, motion capture
   - Stream type: `JointAngles` or custom name
   - Typical output: 5 angles [MCP, PIP, DIP, Thumb_MCP, Thumb_IP]
## Quick Start

### Step 1: Launch Data Collection GUI

```bash
cd python-open-ephys/examples/joint_angle_regression
python session_gui.py
```

Or on Windows:
```batch
run_gui.bat
```
### Step 2: Connect to Data Sources

1. **EMG (Open Ephys)**:
   - Set ZMQ Host: `127.0.0.1` (or IP of the Open Ephys computer)
   - Set ZMQ Port: `5556` (match the Open Ephys ZMQ Interface settings)
   - Set EMG sampling rate: `5000` Hz (or your actual rate)
   - Set number of channels: `8` (or your actual count)
   - Click **"Connect"**
   - Verify status shows "Streaming" (green)

2. **Joint Angles (LSL)**:
   - Click **"Connect LSL"**
   - The GUI will search for streams with type `JointAngles`
   - Verify status shows the connected stream name (green)

3. **IMU (Optional - Sleeve IMU)**:
   - Check the **"Enable Sleeve IMU"** checkbox
   - Set IMU Host: `192.168.4.1` (Sleeve IMU default IP)
   - Set IMU Port: `5555` (default)
   - Select Transport: `UDP` or `TCP`
   - Click **"Connect"** (the IMU connects automatically with the EMG)
   - Verify the IMU status shows orientation data (e.g., "R10.5° P-5.2° Y45.3°")
### Step 3: Record Training Data

1. **Enter metadata**:
   - Subject ID: `P001`
   - Session ID: `S01`
   - Notes: `baseline, relaxed grip`

2. **Record data**:
   - Click **"Start Recording"**
   - Perform hand movements:
     - Open/close hand slowly (10 reps)
     - Individual finger flexion/extension (5 reps each)
     - Grip variations (power, pinch, precision)
     - Natural movements (reaching, grasping objects)
   - Recommended duration: **2-5 minutes**
   - Click **"Stop & Save"**

3. **Output**:
   ```
   data/sub-P001_ses-S01_emg-angles.npz
   ```
### Step 4: Train Regression Model

After collecting data, train a regression model to map EMG → joint angles.

**Using Hand-Landmark-Tracker Pipeline** (recommended):
```bash
# Navigate to the Hand-Landmark-Tracker example
cd ../../Hand-Landmark-Tracker/examples/Joint_Kinematics_from_EMG_OpenEphys

# The GUI saves data in a compatible format - use it directly
# See that repository's README for the full training pipeline
```

**Using a Custom Training Script**:
```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Load recorded data
data = np.load('data/sub-P001_ses-S01_emg-angles.npz')
emg = data['emg']                        # (samples, channels)
angles = data['angles']                  # (samples, n_angles)
emg_timestamps = data['emg_timestamps']
angle_timestamps = data['angle_timestamps']

# Align streams: both share the LSL clock, so interpolate each joint
# angle onto the EMG timestamps
angles_aligned = np.stack(
    [np.interp(emg_timestamps, angle_timestamps, angles[:, j])
     for j in range(angles.shape[1])], axis=1)

# Simple features: RMS over non-overlapping 100 ms windows
# (swap in your own preprocessing: notch, bandpass, envelope)
win = int(0.1 * float(data['emg_fs']))
n_win = emg.shape[0] // win
features = np.sqrt((emg[:n_win * win].reshape(n_win, win, -1) ** 2).mean(axis=1))
targets = angles_aligned[:n_win * win].reshape(n_win, win, -1).mean(axis=1)

# Train model
model = RandomForestRegressor()
model.fit(features, targets)
```
## GUI Features

### Experiment Panel
- **Subject ID**: Participant identifier
- **Session ID**: Session/condition identifier
- **Notes**: Free-form metadata

### EMG Acquisition Panel
- **Connection Settings**:
  - ZMQ Host: IP address of the Open Ephys computer
  - ZMQ Port: ZMQ Interface data port (default 5556)
  - EMG fs: Sampling frequency in Hz
  - Channels: Number of EMG channels to record
- **IMU Settings** (optional):
  - Enable Sleeve IMU: Checkbox to enable/disable the IMU
  - IMU Host: IP address of the Sleeve IMU device (default 192.168.4.1)
  - IMU Port: UDP/TCP port (default 5555)
  - Transport: UDP (recommended) or TCP
- **Live Monitoring**:
  - Samples buffered: Current buffer size
  - EMG RMS: Root mean square of the signal
  - EMG σ: Standard deviation
  - IMU: Roll/pitch/yaw orientation (when enabled)
  - Update rate: GUI refresh rate
- **Controls**:
  - Connect/Disconnect: Manage the ZMQ connection (and IMU if enabled)
  - Auto-reconnect: Automatically reconnect if the connection drops
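The RMS and σ readouts are ordinary signal statistics over the current buffer; presumably they amount to something like this sketch (random data standing in for real EMG):

```python
import numpy as np

buf = np.random.randn(5000, 8)            # stand-in for 1 s of 8-ch EMG at 5 kHz
rms = float(np.sqrt(np.mean(buf ** 2)))   # value shown as "EMG RMS"
sigma = float(np.std(buf))                # value shown as "EMG σ"
print(f'RMS={rms:.3f}  sigma={sigma:.3f}')
```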
### Joint Angle Input Panel
- **LSL Connection**: Searches for streams with type `JointAngles`
- **Status**: Shows the connected stream name and rate
- **Compatibility**: Works with any LSL source (hand tracking, goniometers, etc.)

### Recording Panel
- **Output Path**: Auto-generated from subject/session, or custom
- **Controls**: Start/stop recording
- **Status**: Shows recording progress and save confirmation
## Data Format

Saved NPZ files contain:

```python
{
    'emg': ndarray, shape (samples, channels)
        # EMG data in microvolts or raw ADC units

    'emg_timestamps': ndarray, shape (samples,)
        # LSL timestamps for each EMG sample

    'angles': ndarray, shape (samples, n_angles)
        # Joint angles in degrees or radians

    'angle_timestamps': ndarray, shape (samples,)
        # LSL timestamps for each angle sample

    'imu': ndarray, shape (samples, 9)
        # IMU data: [roll, pitch, yaw, accel_x, accel_y, accel_z, gyro_x, gyro_y, gyro_z]
        # Note: the Sleeve IMU only provides RPY; the other channels are zeros
        # Synchronized to EMG timestamps

    'emg_fs': float
        # EMG sampling frequency (Hz)

    'emg_channels': int
        # Number of EMG channels

    'subject': str
        # Subject ID

    'session': str
        # Session ID

    'notes': str
        # Session notes
}
```
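To check a recording against the schema above, load it and list the arrays. A self-contained sketch (it first writes a dummy `demo.npz` with the documented keys, since a real recording path will vary):

```python
import numpy as np

# Dummy file with the documented keys (placeholder shapes and values)
np.savez('demo.npz',
         emg=np.zeros((5000, 8)), emg_timestamps=np.zeros(5000),
         angles=np.zeros((300, 5)), angle_timestamps=np.zeros(300),
         imu=np.zeros((5000, 9)),
         emg_fs=5000.0, emg_channels=8,
         subject='P001', session='S01', notes='demo')

data = np.load('demo.npz')
for key in data.files:
    print(key, data[key].shape, data[key].dtype)
```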

### Timestamp Synchronization

Both EMG and angles use **LSL timestamps** (`pylsl.local_clock()`), enabling precise synchronization even across different computers. This is critical for regression training.
## Typical Workflow

### 1. Calibration Session (5-10 minutes)
- Record baseline data with no EMG activity
- Record maximum voluntary contraction (MVC) for each muscle
- Record the full range of motion for each joint

### 2. Training Data Collection (15-30 minutes)
- Multiple sessions with varied movements:
  - Session 1: Slow, controlled movements
  - Session 2: Fast, dynamic movements
  - Session 3: Object manipulation tasks
- Merge data from multiple sessions for robust training

### 3. Model Training
- Preprocess EMG (notch filter, bandpass, envelope)
- Extract features (RMS, MAV, waveform properties)
- Train a regression model (MLP, Random Forest, or Transformer)
- Validate with a held-out test set
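The preprocess step above can be sketched with `scipy.signal`; the filter orders, cutoffs, and the 60 Hz notch below are common surface-EMG choices, not values mandated by this example:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 5000.0
emg = np.random.randn(10 * int(fs), 8)   # stand-in for 10 s of 8-channel EMG

# 1) Notch out mains interference (60 Hz here; use 50 Hz where applicable)
b, a = iirnotch(w0=60.0, Q=30.0, fs=fs)
x = filtfilt(b, a, emg, axis=0)

# 2) Bandpass to the usual surface-EMG band
b, a = butter(4, [20.0, 450.0], btype='bandpass', fs=fs)
x = filtfilt(b, a, x, axis=0)

# 3) Linear envelope: rectify, then 5 Hz lowpass
b, a = butter(4, 5.0, btype='lowpass', fs=fs)
envelope = filtfilt(b, a, np.abs(x), axis=0)
```

Zero-phase filtering (`filtfilt`) is fine offline; a causal filter would be needed for the real-time prediction step.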
### 4. Real-time Prediction
- Use the trained model with the live ZMQ stream
- See `python-open-ephys/examples/gesture_classifier/3_predict_realtime.py` for a template
252+
## Troubleshooting
253+
254+
### "python-oephys missing"
255+
```bash
256+
pip install --index-url https://test.pypi.org/simple/ --no-deps python-oephys
257+
pip install numpy zmq
258+
```
259+
260+
### "No LSL stream found"
261+
- Verify hand tracking system is running
262+
- Check that LSL broadcast is enabled
263+
- Try alternative stream names in LSL search
264+
- Use `pylsl` utilities to list available streams:
265+
```python
266+
from pylsl import resolve_streams
267+
streams = resolve_streams(wait_time=2.0)
268+
for s in streams:
269+
print(f"{s.name()}: {s.type()}")
270+
```

### EMG signal quality issues
- Check electrode impedance
- Verify the channel mapping in Open Ephys
- Adjust gain settings if the signal is clipping or too small
- Use the EMG RMS/σ display to monitor signal quality

### Timestamp alignment errors
- Ensure both systems use the LSL clock (`pylsl.local_clock()`)
- Check for clock drift over long recordings (>10 minutes)
- Verify sampling rates are accurate
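One quick drift check: compare the rate implied by the recorded timestamps with the nominal rate. A sketch with synthetic timestamps standing in for `emg_timestamps`:

```python
import numpy as np

nominal_fs = 5000.0
# Synthetic 10-minute recording whose clock runs ~0.02% fast
emg_ts = np.arange(0, 600, 1 / 5001.0)

effective_fs = (emg_ts.size - 1) / (emg_ts[-1] - emg_ts[0])
drift_ppm = (effective_fs / nominal_fs - 1.0) * 1e6
print(f'effective fs = {effective_fs:.2f} Hz ({drift_ppm:.0f} ppm vs nominal)')
```

Drift of a few hundred ppm accumulates to tens of milliseconds over a 10-minute recording, which is enough to blur EMG-angle alignment.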
## Integration with Hand-Landmark-Tracker

This example is designed to work seamlessly with the [Hand-Landmark-Tracker](https://github.com/Jshulgach/Hand-Landmark-Tracker) repository:

1. **Collect data** using this GUI (`session_gui.py`)
2. **Train models** using Hand-Landmark-Tracker's pipeline:
   ```bash
   cd Hand-Landmark-Tracker/examples/Joint_Kinematics_from_EMG_OpenEphys
   python oephys_create_dataset.py   # Creates the training dataset
   cd ../Joint_Kinematics_from_EMG
   python train_model.py             # Trains the PyTorch EMGRegressor
   ```
3. **Run real-time prediction** with the trained model

See [Hand-Landmark-Tracker/examples/Joint_Kinematics_from_EMG_OpenEphys/README.md](https://github.com/Jshulgach/Hand-Landmark-Tracker/tree/main/examples/Joint_Kinematics_from_EMG_OpenEphys) for full pipeline documentation.
## Example Use Cases

### 1. Prosthetic Control
- Train a regression model to map EMG → desired joint angles
- Use for proportional control of a robotic hand
- Real-time performance: <10 ms latency

### 2. Rehabilitation Assessment
- Track recovery of EMG-movement coupling after injury
- Compare the affected vs. unaffected limb
- Longitudinal analysis of motor control

### 3. Biomechanics Research
- Study muscle synergies during complex tasks
- Validate musculoskeletal models
- EMG-driven joint angle estimation
## File Structure

```
joint_angle_regression/
├── session_gui.py      # Main data collection GUI
├── run_gui.bat         # Windows launcher
├── data/               # Recorded datasets (not tracked)
├── models/             # Trained models (not tracked)
└── README.md           # This file
```
## Citation

If you use this example in your research, please cite:

```bibtex
@software{python_oephys_2024,
  title  = {python-open-ephys: Python interface for Open Ephys},
  author = {Neuro-Mechatronics Lab},
  year   = {2024},
  url    = {https://github.com/Neuro-Mechatronics-Interfaces/python-open-ephys}
}
```

## License

MIT License - see the python-open-ephys repository root for details.
## Support

- **Issues**: https://github.com/Neuro-Mechatronics-Interfaces/python-open-ephys/issues
- **Discussions**: https://github.com/Neuro-Mechatronics-Interfaces/python-open-ephys/discussions
- **Email**: Contact the NML team via the repository

---

**Author**: Neuro-Mechatronics Lab (NML)
**Created**: 2026-02-16
**Updated**: 2026-02-16
