Get up and running with GeoSync in 5 minutes.
- Python 3.11 or higher
- pip (Python package manager)
- git (version control)
- Basic understanding of Python and trading concepts
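Before installing, you can confirm your interpreter meets the version requirement with a quick check:

```python
import sys

# GeoSync targets Python 3.11+; check before creating the virtual environment.
meets_requirement = sys.version_info >= (3, 11)
print(f"Python {sys.version_info.major}.{sys.version_info.minor}: "
      f"{'OK' if meets_requirement else 'upgrade required'}")
```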
git clone https://github.com/neuron7xLab/GeoSync.git
cd GeoSync

# Create virtual environment
python -m venv .venv
# Activate it
# On Linux/macOS:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate

# Install runtime packages (locked)
pip install -r requirements.lock
# Install development tools (optional, extends runtime lock)
pip install -r requirements-dev.lock
# Extras: install only what you need
# pip install ".[connectors]" # market & broker integrations
# pip install ".[gpu]" # GPU acceleration backends
# pip install ".[docs]" # documentation toolchain

Prefer a single command that provisions everything? Use the bundled bootstrapper from the repository root:
python -m scripts bootstrap --include-dev --verify --smoke-test

This creates .venv, installs the locked dependencies, runs pip check, and executes a tiny sample analysis to confirm the stack is healthy.
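The bootstrapper is roughly equivalent to running the manual steps above in order. As a sketch (the real logic lives in the repository's scripts package; the exact step list here is an assumption based on the description above):

```python
# Sketch of the bootstrap sequence described above; illustrative only,
# not the actual implementation in the scripts package.
import sys

def bootstrap_commands(include_dev: bool = True) -> list[list[str]]:
    """Return the commands the bootstrapper runs, in order."""
    py = sys.executable
    cmds = [
        [py, "-m", "venv", ".venv"],                              # create venv
        [py, "-m", "pip", "install", "-r", "requirements.lock"],  # runtime deps
    ]
    if include_dev:
        cmds.append([py, "-m", "pip", "install", "-r", "requirements-dev.lock"])
    cmds.append([py, "-m", "pip", "check"])                       # verify installs
    return cmds

for cmd in bootstrap_commands():
    print(" ".join(cmd))
```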
# Test that imports work
python -c "from core.indicators.kuramoto import compute_phase; print('OK')"
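For a finer-grained check, this sketch imports each module used later in this guide individually, so a missing dependency points at the exact import that fails (module paths are taken from the examples below):

```python
# Check each module used in this guide; a failure names the exact import.
import importlib

modules = [
    "core.indicators.kuramoto",
    "core.indicators.entropy",
    "core.indicators.ricci",
    "backtest.engine",
]
results = {}
for mod in modules:
    try:
        importlib.import_module(mod)
        results[mod] = "OK"
    except ImportError as exc:  # missing package or wrong working directory
        results[mod] = f"FAILED ({exc})"

for mod, status in results.items():
    print(f"{mod}: {status}")
```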
# Run tests
pytest tests/ -v

# Analyze the included sample.csv
python -m interfaces.cli analyze --csv sample.csv --window 200

Expected output:
{
  "R": 0.85,
  "H": 2.34,
  "delta_H": 0.12,
  "kappa_mean": 0.45,
  "Hurst": 0.58,
  "phase": "trending"
}

Create a CSV file with this format:
timestamp,close,volume
2024-01-01 00:00:00,50000,100
2024-01-01 00:01:00,50100,150
2024-01-01 00:02:00,49950,120

Then analyze:
python -m interfaces.cli analyze --csv your_data.csv

# Simple backtest
python -m interfaces.cli backtest \
  --csv sample.csv \
  --train-window 500 \
  --test-window 100 \
  --initial-capital 10000

Output shows:
- Total return
- Sharpe ratio
- Max drawdown
- Win rate
- Number of trades
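These are standard performance metrics. A self-contained sketch of how each is typically computed from an equity curve (illustrative formulas, not necessarily the engine's exact implementation):

```python
import numpy as np

# Toy equity curve: account capital over time.
equity = np.array([10_000, 10_200, 10_100, 10_400, 10_300, 10_600], dtype=float)
returns = np.diff(equity) / equity[:-1]          # per-period simple returns

total_return = equity[-1] / equity[0] - 1        # overall growth
sharpe = returns.mean() / returns.std(ddof=1)    # per-period; annualize by sqrt(N)
running_max = np.maximum.accumulate(equity)      # best capital seen so far
max_drawdown = ((equity - running_max) / running_max).min()  # deepest dip
win_rate = (returns > 0).mean()                  # fraction of winning periods

print(f"Total return: {total_return:.2%}")
print(f"Sharpe (per period): {sharpe:.2f}")
print(f"Max drawdown: {max_drawdown:.2%}")
print(f"Win rate: {win_rate:.2%}")
```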
Create a file my_analysis.py:
import numpy as np
import pandas as pd
from core.indicators.kuramoto import compute_phase, kuramoto_order
from core.indicators.entropy import entropy
from core.indicators.ricci import build_price_graph, mean_ricci
# Load data
df = pd.read_csv('sample.csv')
prices = df['close'].to_numpy()
# Compute indicators
phases = compute_phase(prices)
R = kuramoto_order(phases[-200:])
H = entropy(prices[-200:])
G = build_price_graph(prices[-200:], delta=0.005)
kappa = mean_ricci(G)
print(f"Kuramoto Order: {R:.3f}")
print(f"Entropy: {H:.3f}")
print(f"Mean Ricci Curvature: {kappa:.3f}")

Run it:
python my_analysis.py

Create my_strategy.py:
import numpy as np
from backtest.engine import walk_forward
def my_signal_function(prices: np.ndarray, window: int = 50) -> np.ndarray:
    """Simple moving average crossover strategy."""
    signals = np.zeros(len(prices))
    # np.ones(n) / n is an n-point averaging kernel (true division, not //)
    fast_ma = np.convolve(prices, np.ones(window) / window, mode='valid')
    slow_ma = np.convolve(prices, np.ones(window * 2) / (window * 2), mode='valid')
    # Align arrays
    min_len = min(len(fast_ma), len(slow_ma))
    fast_ma = fast_ma[-min_len:]
    slow_ma = slow_ma[-min_len:]
    # Generate signals on crossovers
    for i in range(1, len(fast_ma)):
        if fast_ma[i] > slow_ma[i] and fast_ma[i - 1] <= slow_ma[i - 1]:
            signals[-(min_len - i)] = 1   # Buy signal
        elif fast_ma[i] < slow_ma[i] and fast_ma[i - 1] >= slow_ma[i - 1]:
            signals[-(min_len - i)] = -1  # Sell signal
    return signals
# Load data
import pandas as pd
df = pd.read_csv('sample.csv')
prices = df['close'].to_numpy()
# Backtest
results = walk_forward(
    prices=prices,
    signal_func=my_signal_function,
    train_window=500,
    test_window=100,
    initial_capital=10000.0
)
print(f"Total Return: {results['total_return']:.2%}")
print(f"Sharpe Ratio: {results['sharpe_ratio']:.2f}")
print(f"Max Drawdown: {results['max_drawdown']:.2%}")

Run:
python my_strategy.py

If you want to use monitoring features:
# Start Prometheus and Grafana
docker compose up -d prometheus grafana
# Access Grafana at http://localhost:3000
# Default credentials: admin/admin

For large datasets or production deployments, use performance optimizations:
import numpy as np
from core.indicators.entropy import EntropyFeature
from core.indicators.hurst import HurstFeature
from core.data.preprocess import scale_series
# Simulate a large dataset
large_data = np.random.randn(1_000_000)
# Memory-efficient processing with float32
entropy_feat = EntropyFeature(
    bins=50,
    use_float32=True,    # 50% memory reduction
    chunk_size=100_000   # Process in chunks
)
hurst_feat = HurstFeature(
    use_float32=True
)
# Scale data efficiently
scaled = scale_series(large_data, use_float32=True)
# Compute indicators with monitoring
result = entropy_feat.transform(scaled)
print(f"Entropy: {result.value:.4f}")
# Enable structured logging
from core.utils.logging import configure_logging
configure_logging(level="INFO", use_json=True)
# Start Prometheus metrics server
from core.utils.metrics import start_metrics_server
start_metrics_server(port=8000)  # Metrics at http://localhost:8000/metrics

See the Performance Optimization Guide for details.
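EntropyFeature's internals aren't shown here, but the chunking and reduced-precision idea it describes can be illustrated with plain NumPy: build a histogram incrementally from float32 chunks, then compute Shannon entropy from the accumulated counts. A sketch under those assumptions:

```python
import numpy as np

def chunked_entropy(data: np.ndarray, bins: int = 50,
                    chunk_size: int = 100_000) -> float:
    """Shannon entropy (nats) via an incrementally built histogram.

    Illustrates the chunked/float32 idea; not EntropyFeature's actual code.
    """
    lo, hi = float(data.min()), float(data.max())
    counts = np.zeros(bins, dtype=np.int64)
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size].astype(np.float32)  # halve memory
        hist, _ = np.histogram(chunk, bins=bins, range=(lo, hi))
        counts += hist
    p = counts / counts.sum()
    p = p[p > 0]                       # treat 0 * log(0) as 0
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
h = chunked_entropy(rng.standard_normal(500_000))
print(f"Entropy: {h:.4f}")
```

The histogram counts are the only state kept between chunks, so peak memory is one chunk plus the bin array regardless of dataset size.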
Now that you're up and running, explore:
- Performance Guide - Memory optimization and execution profiling
- Indicators Guide - Learn about available indicators
- Backtesting Guide - Advanced backtesting features
- Execution Guide - Live trading setup
- Extending GeoSync - Add custom indicators and strategies
- Integration API - Connect to exchanges
- Read CONTRIBUTING.md
- Set up your IDE
- Run the full test suite:
pytest tests/ \
  --cov=core --cov=backtest --cov=execution \
  --cov-config=configs/quality/critical_surface.coveragerc \
  --cov-report=term-missing --cov-report=xml

python -m tools.coverage.guardrail \
  --config configs/quality/critical_surface.toml \
  --coverage coverage.xml
- Explore the codebase
- Get API keys from your exchange
- Set up paper trading first
- Test strategies thoroughly
- Start with small position sizes
- Monitor closely
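One way to act on "start with small position sizes" is fixed-fractional sizing: choose the position so a stop-out loses at most a set fraction of capital. A sketch (illustrative only, not a GeoSync API):

```python
def position_size(capital: float, risk_fraction: float,
                  entry: float, stop: float) -> float:
    """Units to buy so that hitting the stop loses at most
    risk_fraction of capital."""
    risk_per_unit = abs(entry - stop)
    if risk_per_unit == 0:
        return 0.0
    return (capital * risk_fraction) / risk_per_unit

# Risk 1% of $10,000 with entry 50,000 and stop 49,500:
# $100 at risk / $500 per unit = 0.2 units
print(position_size(10_000, 0.01, 50_000, 49_500))
```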
- Read the Architecture Documentation
- Study the included examples
- Experiment with different indicators
- Join the community discussions
# Analyze data
python -m interfaces.cli analyze --csv data.csv
# Run backtest
python -m interfaces.cli backtest --csv data.csv --train-window 500
# Run tests
pytest tests/
# Run linter
ruff check .
# Format code
black .
# Type checking
mypy core/

GeoSync/
├── core/ # Core trading logic
│ ├── indicators/ # Technical indicators
│ ├── agent/ # Strategy optimization
│ ├── data/ # Data handling
│ └── phase/ # Regime detection
├── backtest/ # Backtesting engine
├── execution/ # Order execution
├── interfaces/ # CLI and APIs
├── tests/ # Test suite
└── docs/ # Documentation
If you see ModuleNotFoundError:
# Ensure you're in the project directory
cd /path/to/GeoSync
# Install in editable mode
pip install -e .

If a package is missing:
# Reinstall all dependencies (dev extras include runtime stack)
pip install -r requirements-dev.lock

On Linux/macOS:
# Inspect available helper commands
python -m scripts --help

- Check Troubleshooting Guide
- Read FAQ
- Open an issue on GitHub
You've successfully:
- ✅ Installed GeoSync
- ✅ Verified the installation
- ✅ Analyzed market data
- ✅ Run a backtest
- ✅ Explored indicators
- ✅ Created a custom strategy
Ready to dive deeper? Check out the full documentation.
Time to complete: ~5-10 minutes
Difficulty: Beginner
Last Updated: 2025-01-01