
Commit 0f1f73b
Complete Phase 4: Release Preparation (Docs, Packaging, Cleanup)
1 parent 58f29d8 commit 0f1f73b
20 files changed: 3258 additions & 444 deletions

README.md
Lines changed: 81 additions & 34 deletions
@@ -1,48 +1,95 @@
 # Python Open-Ephys
 
-This repository provides a set of tools using the Open-Ephys data acquisition system and GUI for electromyography (EMG) data.
+A comprehensive Python toolkit for working with Open Ephys devices, featuring signal processing, machine learning, and real-time visualization tools. This package integrates with the Open Ephys GUI via ZMQ.
 
-<!-- ![intan_logo.png](/assets/intan_logo.png) -->
+![Python](https://img.shields.io/badge/python-3.10%2B-blue)
+![License](https://img.shields.io/badge/license-MIT-green)
 
+## Features
 
-Code was written and tested using Windows 11, Python 3.10.
-![Python](https://img.shields.io/badge/python-3.10-blue)
-![License](https://img.shields.io/badge/license-MIT-green)
+### 🖥️ Applications (GUI)
+- **Real-Time EMG Viewer**: Live ZMQ streaming plot with signal quality checks (`channels` loaded dynamically).
+- **Offline EMG Viewer**: Advanced offline analysis of `.npz` recordings with filtering and spectrograms.
+- **Trial Selector**: Tool for manual segmentation and labeling of EMG trials.
+
+### 🧠 Machine Learning
+- **EMGClassifierCNNLSTM**: Hybrid CNN-LSTM model for spatio-temporal gesture recognition.
+- **Model Manager**: Utilities to save, load, and manage PyTorch models.
+- **Evaluation**: Metrics and tools for evaluating model performance.
+
+### 📡 Signal Processing
+- **Channel QC**: Automated signal quality assessment (noise, saturation).
+- **Synchronization**: Tools to align multi-modal data (e.g., EMG + motion capture/video).
+- **Filtering**: Real-time and offline filters (bandpass, notch, smoothing).
+- **Features**: Extract RMS, MAV, zero crossings, and IMU features.
 
 ## Installation
 
-1. Create a virtual environment using [Anaconda](https://www.anaconda.com/products/distribution) or Python's virtualenv
-   - Using Anaconda:
-     ~~~
-     conda create -n ephys
-     conda activate ephys
-     ~~~
-   - Using Python's virtualenv:
-     ~~~
-     python3 -m venv .ephys
-     source .ephys/bin/activate   # Linux
-     call .ephys/Scripts/activate # Windows
-     ~~~
-2. Clone the repository and navigate to the project directory
-   ~~~
-   git clone https://github.com/Neuro-Mechatronics-Interfaces/Python_Open-Ephys.git
-   cd Python_Open-Ephys
-   ~~~
-3. Install dependencies
-   ~~~
-   pip install -r requirements.txt
-   ~~~
-4. Setup Open-Ephys GUI
-   - Install from the [Open-Ephys website](https://open-ephys.org/gui) and select your system
-   - Install the [ZMQ Plugin](https://open-ephys.github.io/gui-docs/User-Manual/Plugins/ZMQ-Interface.html) for streaming data
+### From Source
+1. **Clone the repository**:
+   ```bash
+   git clone https://github.com/Neuro-Mechatronics-Interfaces/python-open-ephys.git
+   cd python-open-ephys
+   ```
+
+2. **Create a virtual environment** (recommended):
+   ```bash
+   python -m venv .venv
+   # Windows
+   .venv\Scripts\activate
+   # Linux/Mac
+   source .venv/bin/activate
+   ```
+
+3. **Install dependencies**:
+   ```bash
+   pip install -e .
+   ```
+
+### Dependencies
+- `numpy`, `scipy`, `matplotlib`, `pandas`
+- `torch` (for ML models)
+- `pyqt5`, `pyqtgraph` (for GUIs)
+- `pyzmq` (for streaming)
+- `open-ephys-python-tools`
 
 ## Usage
 
-The OpenEphysClient class can be easily imported into your current project. The class provides a simple interface to connect to the Open-Ephys GUI and stream data.
+### 1. Real-Time Viewer
+Launch the real-time plotter to visualize live data from Open Ephys:
+```bash
+python -m pyoephys.applications._realtime_viewer --host 127.0.0.1 --channels 0 1 2 3
+```
+*Note: Ensure the ZMQ Interface plugin is active in the Open Ephys GUI.*
+
+### 2. Machine Learning
+Train a CNN-LSTM model for gesture recognition:
+```python
+from pyoephys.ml import EMGClassifierCNNLSTM
+import torch
+
+# Initialize model (4 classes, 8 channels, 200 ms window)
+model = EMGClassifierCNNLSTM(num_classes=4, num_channels=8, input_window=200)
+
+# Train (see examples/ml/train_cnn_lstm.py for the full loop)
+# model.fit(X_train, y_train)
+```
 
+### 3. Signal Processing
+Check the signal quality of your recording:
 ```python
-from open_ephys import OpenEphysClient
-client = OpenEphysClient()
-samples = client.get_samples(channel=8)
+from pyoephys.processing import ChannelQC
+
+qc = ChannelQC(fs=2000)
+results = qc.compute_qc(data_chunk)
+print(results)  # Status (Good/Bad) per channel
 ```
-Check the directory for other demo example scripts
+
+## Examples
+Check the `examples/` directory for complete scripts:
+- `examples/ml/train_cnn_lstm.py`: Train a gesture classifier.
+- `examples/integration/sync_multimodal_data.py`: Align EMG with 3D hand landmarks.
+- `examples/processing/run_channel_qc.py`: Run quality control checks.
+
+## License
+MIT License. See [LICENSE](LICENSE) for details.
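The features named above (RMS, MAV, zero crossings) follow standard time-domain definitions. As a point of reference only, here is a plain-NumPy sketch; the `emg_features` helper below is hypothetical and is not part of the `pyoephys` API shown in this commit:

```python
import numpy as np

def emg_features(window):
    """Per-channel time-domain features for a window shaped (channels, samples)."""
    window = np.asarray(window, dtype=float)
    rms = np.sqrt(np.mean(window ** 2, axis=1))  # root mean square
    mav = np.mean(np.abs(window), axis=1)        # mean absolute value
    # Zero crossings: count sign changes along the time axis
    signs = np.signbit(window).astype(np.int8)
    zc = np.sum(np.diff(signs, axis=1) != 0, axis=1)
    return {"rms": rms, "mav": mav, "zc": zc}
```

For a perfectly alternating ±1 signal, RMS and MAV are both 1 and every sample boundary is a zero crossing.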
examples/integration/sync_multimodal_data.py (new file)

Lines changed: 139 additions & 0 deletions
"""
Multimodal Synchronization Example
----------------------------------
Aligns Open Ephys EMG data with hand-tracking landmarks.

This script demonstrates:
1. Loading EMG data from an Open Ephys session (or mock data).
2. Loading landmark data from a .npz file (captured via udp_landmark_logger.py).
3. Computing a movement signal from 3D hand landmarks.
4. Computing the EMG envelope.
5. Finding the temporal offset that aligns the two streams.

Usage:
    python sync_multimodal_data.py --emg <path_to_session> --landmarks landmarks.npz
"""

import argparse
import numpy as np
import matplotlib.pyplot as plt
from pyoephys.processing import (
    compute_landmark_movement_signal,
    compute_emg_envelope_signal,
    find_sync_offset
)
from pyoephys.io import load_open_ephys_session


def create_mock_data(duration=30.0, emg_fs=2000, landmark_fs=30.0, offset_sec=2.5):
    """Generate mock data with a known offset if no files are provided.

    Ground-truth events occur at `burst_times`. The landmarks see each event
    at time t, while the EMG sees it at t + offset_sec, i.e. the EMG stream
    is delayed by offset_sec relative to the landmarks.
    """
    print("Generating mock data...")
    t_emg = np.arange(0, duration, 1 / emg_fs)
    t_lm = np.arange(0, duration, 1 / landmark_fs)

    # EMG activity profile: Gaussian bursts, delayed by offset_sec
    activity = np.zeros_like(t_emg)
    burst_times = [5.0, 15.0, 25.0]
    for bt in burst_times:
        activity += np.exp(-0.5 * ((t_emg - bt - offset_sec) / 0.5) ** 2)

    # EMG = noise amplitude-modulated by the activity profile
    emg_data = np.random.randn(8, len(t_emg)) * (1 + 10 * activity)

    # Landmarks: movement at the (unshifted) burst times
    n_frames = len(t_lm)
    landmarks = np.zeros((n_frames, 21, 3))
    for i, t in enumerate(t_lm):
        dist = np.min(np.abs(t - np.array(burst_times)))
        if dist < 1.0:
            # Oscillate the index fingertip (landmark 8) along y
            landmarks[i, 8, 1] = np.sin(20 * t) * 0.1

    return emg_data, t_emg, landmarks, t_lm


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--emg", type=str, help="Path to Open Ephys session folder")
    parser.add_argument("--landmarks", type=str, help="Path to landmarks.npz")
    args = parser.parse_args()

    if args.emg and args.landmarks:
        # Load real data
        print(f"Loading EMG from {args.emg}...")
        session = load_open_ephys_session(args.emg)
        emg_data = session['amplifier_data']
        emg_fs = session['sample_rate']
        t_emg = np.arange(emg_data.shape[1]) / emg_fs

        print(f"Loading landmarks from {args.landmarks}...")
        lm_data = np.load(args.landmarks)
        # (T, Hands, 21, 3) -> take the first hand
        landmarks = lm_data['landmarks'][:, 0, :, :]
        t_lm = lm_data['timestamps']
        # Both streams carry system-time stamps, so we keep them absolute
        # rather than zero-aligning each stream separately.
    else:
        # Use mock data
        emg_data, t_emg, landmarks, t_lm = create_mock_data()
        emg_fs = 1.0 / (t_emg[1] - t_emg[0])

    # 1. Process landmarks -> movement signal
    print("Computing landmark movement signal...")
    # landmarks shape: (frames, 21, 3)
    lm_signal, t_lm_clean = compute_landmark_movement_signal(landmarks, t_lm)

    # 2. Process EMG -> envelope
    print("Computing EMG envelope...")
    emg_env, t_emg_env = compute_emg_envelope_signal(emg_data, emg_fs)

    # 3. Find sync offset
    print("Calculating synchronization offset...")
    # Per the find_sync_offset docstring, a positive offset means the
    # landmarks are delayed relative to the EMG.
    sync_result = find_sync_offset(emg_env, t_emg_env, lm_signal, t_lm_clean)

    offset = sync_result['offset_sec']
    conf = sync_result['confidence']
    print(f"\nFound offset: {offset:.4f} seconds")
    print(f"Confidence: {conf:.2f}")

    # 4. Plot original vs. aligned signals
    plt.figure(figsize=(10, 6))

    # Normalize for plotting
    def norm(x):
        return (x - np.mean(x)) / np.std(x)

    plt.subplot(2, 1, 1)
    plt.title("Original Signals")
    plt.plot(t_emg_env, norm(emg_env), label="EMG Envelope", alpha=0.7)
    plt.plot(t_lm_clean, norm(lm_signal), label="Landmark Movement", alpha=0.7)
    plt.legend()
    plt.grid(True)

    plt.subplot(2, 1, 2)
    plt.title(f"Aligned (Landmarks shifted by {-offset:.2f}s)")
    plt.plot(t_emg_env, norm(emg_env), label="EMG Envelope", alpha=0.7)
    plt.plot(t_lm_clean - offset, norm(lm_signal), label="Aligned Landmarks", alpha=0.7)
    plt.legend()
    plt.grid(True)

    plt.tight_layout()
    plt.show()


if __name__ == "__main__":
    main()
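The implementation of `find_sync_offset` is not included in this commit. As a rough illustration of the underlying idea, here is a minimal cross-correlation sketch; the `estimate_offset` helper below is hypothetical and simply resamples both signals to a shared grid before correlating:

```python
import numpy as np

def estimate_offset(sig_a, t_a, sig_b, t_b, fs=50.0):
    """Estimate how much sig_b lags sig_a, in seconds (positive = b delayed)."""
    # Resample both signals onto a shared uniform time grid
    t0 = max(t_a[0], t_b[0])
    t1 = min(t_a[-1], t_b[-1])
    t = np.arange(t0, t1, 1.0 / fs)
    a = np.interp(t, t_a, sig_a)
    b = np.interp(t, t_b, sig_b)
    # Z-score so the correlation peak is amplitude-invariant
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    # Full cross-correlation; the peak location gives the sample lag
    xc = np.correlate(a, b, mode="full")
    lag = np.argmax(xc) - (len(b) - 1)
    return -lag / fs
```

With two Gaussian bursts 0.5 s apart, the helper recovers an offset of about 0.5 s.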
udp_landmark_logger.py (new file)

Lines changed: 117 additions & 0 deletions
"""
UDP Landmark Logger
-------------------
Listens for hand landmarks broadcast by the Stereo Hand Tracker (port 5005)
and saves them to a structured .npz file for synchronization with EMG data.

Usage:
    python udp_landmark_logger.py --output landmarks.npz --duration 30
"""

import socket
import json
import time
import argparse
import numpy as np


def main():
    parser = argparse.ArgumentParser(description="Log UDP landmarks to NPZ")
    parser.add_argument("--port", type=int, default=5005, help="UDP port (default: 5005)")
    parser.add_argument("--output", type=str, default="landmarks.npz", help="Output filename")
    parser.add_argument("--duration", type=float, default=60.0, help="Recording duration in seconds")
    args = parser.parse_args()

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", args.port))
    sock.settimeout(1.0)

    print(f"Listening on port {args.port}...")
    print(f"Recording to {args.output} for {args.duration} seconds...")
    print("Press Ctrl+C to stop early.")

    timestamps = []
    frames = []
    # Landmarks are stored per frame as a list of (21, 3) arrays, one per
    # detected hand; N_hands can vary, so we pad to a dense array on save.
    landmark_history = []

    start_time = time.time()
    packet_count = 0

    try:
        while True:
            elapsed = time.time() - start_time
            if elapsed >= args.duration:
                break

            try:
                data, addr = sock.recvfrom(65535)
                packet = json.loads(data.decode("utf-8"))

                # Packet format:
                # {'timestamp': float,
                #  'hands': [{'hand_index': i, 'landmarks': [[x, y, z], ...]}, ...]}

                # Prefer the sender's timestamp when present (better for
                # cross-stream sync if the source clock is reliable); fall
                # back to local receive time.
                ts = packet.get("timestamp", time.time())

                hands_list = packet.get("hands", [])

                # Sort hands by index so hand ordering stays consistent
                hands_list.sort(key=lambda h: h.get("hand_index", 0))

                # Extract landmarks: (N_hands, 21, 3)
                current_frame_landmarks = []
                for h in hands_list:
                    lm = np.array(h["landmarks"])  # (21, 3)
                    current_frame_landmarks.append(lm)

                if current_frame_landmarks:
                    landmark_history.append(current_frame_landmarks)
                    timestamps.append(ts)
                    frames.append(packet.get("frame", packet_count))
                    packet_count += 1

                    if packet_count % 100 == 0:
                        print(f"\rCaptured {packet_count} frames ({elapsed:.1f}s)", end="")

            except socket.timeout:
                continue

    except KeyboardInterrupt:
        print("\nStopping...")

    print(f"\nSaving {len(timestamps)} frames to {args.output}")

    if not landmark_history:
        print("No data captured.")
        return

    # Pad to a dense (T, max_hands, 21, 3) array; frames with fewer hands
    # keep zeros in the unused slots.
    max_hands = max(len(h) for h in landmark_history)
    T = len(timestamps)
    dense_landmarks = np.zeros((T, max_hands, 21, 3), dtype=np.float32)

    for t, hands in enumerate(landmark_history):
        for h_i, hand_lm in enumerate(hands):
            dense_landmarks[t, h_i] = hand_lm

    # Estimate the frame rate from the time actually recorded, which may be
    # shorter than --duration if the logger was stopped early.
    total_time = time.time() - start_time
    np.savez_compressed(
        args.output,
        timestamps=np.array(timestamps),
        frames=np.array(frames),
        landmarks=dense_landmarks,  # (T, Hands, 21, 3)
        fs_estimated=T / total_time if total_time > 0 else 0.0
    )
    print("Done.")


if __name__ == "__main__":
    main()
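To exercise the logger without the Stereo Hand Tracker running, a mock sender can broadcast packets in the format documented in the script's comments. The `send_mock_landmarks` helper below is hypothetical, built only from that packet schema:

```python
import json
import socket
import time

import numpy as np

def send_mock_landmarks(host="127.0.0.1", port=5005, n_frames=5, fps=30.0):
    """Broadcast fake hand-landmark packets in the logger's expected format."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i in range(n_frames):
        packet = {
            "timestamp": time.time(),
            "frame": i,
            "hands": [{
                "hand_index": 0,
                # 21 joints, each an [x, y, z] triple
                "landmarks": np.random.rand(21, 3).tolist(),
            }],
        }
        sock.sendto(json.dumps(packet).encode("utf-8"), (host, port))
        time.sleep(1.0 / fps)
    sock.close()
```

Run the logger in one terminal and `send_mock_landmarks()` in another to produce a small `landmarks.npz` for testing the synchronization example.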