# Python OEphys

[Python](https://www.python.org/)
[MIT License](https://opensource.org/licenses/MIT)
[PyPI](https://badge.fury.io/py/python-oephys)

**python-oephys** is a comprehensive Python toolkit for working with Open Ephys devices and electrophysiology data. It bundles file loading, real-time ZMQ streaming, signal processing, machine learning, and visualization tools into one package for high-density neural data analysis.

---

## ✨ Key Features

- 📁 **File I/O**: Robust support for Open Ephys Binary (`.oebin`) and `.npz` formats
- 🔴 **Real-time Streaming**: Seamless integration with the Open Ephys GUI via ZMQ
- 🎛️ **Signal Processing**: Filtering (bandpass, notch), channel QC, and synchronization
- 🤖 **Machine Learning**: Hybrid CNN-LSTM models for real-time gesture recognition
- 📊 **Visualization**: Real-time EMG viewer, offline analysis, and trial segmentation tools
- 🚀 **Performance**: Optimized for low-latency real-time applications

---

## 📦 Installation

### From TestPyPI (Current Development Release)

```bash
pip install --index-url https://test.pypi.org/simple/ --no-deps python-oephys
```

### From Source

```bash
git clone https://github.com/Neuro-Mechatronics-Interfaces/python-open-ephys.git
cd python-open-ephys
pip install -e .
```

### Optional Extras

- **GUI**: `pip install 'python-oephys[gui]'` (PyQt5, pyqtgraph)
- **ML**: `pip install 'python-oephys[ml]'` (PyTorch, scikit-learn)
- **Docs**: `pip install 'python-oephys[docs]'` (Sphinx)

---

## 🚀 Getting Started

### Load and Filter Data

```python
from pyoephys.io import load_open_ephys_session
from pyoephys.processing import filter_emg

# Load the session metadata and continuous data
sess = load_open_ephys_session('path/to/recording.oebin')
data = sess['amplifier_data']
fs = sess['sample_rate']

# Apply a 10-500 Hz bandpass filter
filtered = filter_emg(data, filter_type='bandpass', lowcut=10, highcut=500, fs=fs)
```
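
The feature list above also mentions channel QC. The package's own QC API is not shown in this README, so here is a rough, stdlib-only sketch of the underlying idea, flagging flat and saturated channels by RMS and clip fraction (the function name and thresholds are illustrative, not pyoephys API):

```python
import math

def channel_qc(channels, flat_rms=1e-6, clip_value=0.95):
    """Label each channel 'good', 'flat', or 'saturated'.

    channels: dict of channel_id -> list of samples normalized to [-1, 1].
    Near-zero RMS suggests a disconnected electrode; many samples at the
    clip level suggest amplifier saturation. Thresholds are illustrative.
    """
    report = {}
    for ch, samples in channels.items():
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        clipped = sum(1 for s in samples if abs(s) >= clip_value) / len(samples)
        if rms < flat_rms:
            report[ch] = "flat"
        elif clipped > 0.05:
            report[ch] = "saturated"
        else:
            report[ch] = "good"
    return report

signals = {
    0: [0.1 * math.sin(0.1 * n) for n in range(1000)],  # healthy sine
    1: [0.0] * 1000,                                    # dead channel
    2: [1.0 if n % 2 else -1.0 for n in range(1000)],   # railed channel
}
print(channel_qc(signals))  # -> {0: 'good', 1: 'flat', 2: 'saturated'}
```

In practice the thresholds would be tuned to the amplifier's gain and noise floor rather than hard-coded.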

### Real-time ZMQ Streaming

```bash
# Launch the live viewer (ensure the ZMQ Interface plugin is active in the GUI)
python -m pyoephys.applications._realtime_viewer --host 127.0.0.1 --channels 0:8
```
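
Under the hood, a viewer like this subscribes to messages published by the GUI's ZMQ Interface plugin. The actual wire format is not documented in this README, so the following stdlib-only sketch only illustrates the general parsing pattern, assuming a JSON header plus an interleaved little-endian float32 payload (field names and frame layout are hypothetical):

```python
import json
import struct

def decode_frame(header_bytes, payload_bytes):
    """Decode one hypothetical data frame: a JSON header describing the
    payload, followed by little-endian float32 samples with channels
    interleaved per sample."""
    header = json.loads(header_bytes)
    n = header["num_channels"] * header["num_samples"]
    samples = struct.unpack("<%df" % n, payload_bytes)
    # De-interleave the flat sample vector into one list per channel
    per_channel = [
        list(samples[c::header["num_channels"]])
        for c in range(header["num_channels"])
    ]
    return header, per_channel

# Build a fake frame for demonstration: 2 channels x 3 samples
header = json.dumps({"num_channels": 2, "num_samples": 3}).encode()
payload = struct.pack("<6f", 1.0, 10.0, 2.0, 20.0, 3.0, 30.0)
_, chans = decode_frame(header, payload)
print(chans)  # -> [[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]]
```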

---

## 🗂️ Package Structure

```text
pyoephys/
├── applications/   # GUI applications (Real-time & Offline viewers)
├── interface/      # ZMQ, LSL, and playback clients
├── io/             # Unified file loaders (.oebin, .npz)
├── ml/             # Gesture classification (CNN-LSTM)
├── plotting/       # Visualization utilities
└── processing/     # Signal filters, QC, and synchronization

examples/
├── benchmarks/       # Throughput and latency tests
├── interface/        # LSL, ZMQ, and hardware control
│   ├── hardware/     # Serial/UDP Pico integration
│   ├── imu/          # Sleeve IMU client & monitor
│   ├── lsl/          # LSL streaming & capture
│   └── zmq/          # ZMQ clients & plotters
└── machine_learning/ # Model training and evaluation
```

---

## 📄 License

This project is licensed under the MIT License; see the [LICENSE](LICENSE) file for details.

---

<p align="center">
  Made with ❤️ by the Neuromechatronics Lab
</p>