Neural networks from scratch, built piece by piece from scalar operations up to complete architectures.
- Python standard library only — no numpy, torch, tensorflow, or any ML/data framework
- matplotlib is the sole exception — allowed for visualization only (in `src/modelwerk/viz/`)
- Compositional layering — each level imports only from levels below:
  - L0: scalar ops → L1: vector/matrix ops → L2: activations/losses → L3: neuron → L4: layers → L5: network → L6: gradients/optimizers → L7: models
- Types: `list[float]` for vectors, `list[list[float]]` for matrices, dataclasses for structured objects
- All randomness goes through `src/modelwerk/primitives/random.py` with explicit seeds
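A hypothetical sketch of what a seeded-randomness helper in the spirit of `src/modelwerk/primitives/random.py` could look like — the real module's API may differ, but the point is a local `random.Random(seed)` instance rather than the global generator:

```python
import random

def seeded_uniform_matrix(rows: int, cols: int, seed: int) -> list[list[float]]:
    """Build a rows x cols matrix (list[list[float]]) from an explicit seed."""
    rng = random.Random(seed)  # local generator; never touches global state
    return [[rng.uniform(-1.0, 1.0) for _ in range(cols)] for _ in range(rows)]

m = seeded_uniform_matrix(2, 3, seed=42)
assert m == seeded_uniform_matrix(2, 3, seed=42)  # same seed, same matrix
```

Requiring an explicit seed at every call site is what makes lessons and tests reproducible run-to-run.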
- `uv run python lessons/01_perceptron.py` — run a lesson
- `uv run pytest tests/` — run tests
- `src/modelwerk/primitives/` — scalar, vector, matrix ops, activations, losses
- `src/modelwerk/building_blocks/` — neuron, layers, network, backprop, optimizers
- `src/modelwerk/models/` — complete architectures (perceptron, mlp, lenet5, transformer)
- `src/modelwerk/data/` — dataset generation and loading
- `src/modelwerk/viz/` — matplotlib visualizations
- `lessons/` — runnable scripts, one per model, with narrative explanations
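As a taste of the simplest architecture listed under `src/modelwerk/models/`, here is a self-contained sketch of the classic perceptron update rule learning OR — names and layout are illustrative, not the repository's actual code:

```python
def perceptron_predict(w: list[float], b: float, x: list[float]) -> int:
    # Step activation over a weighted sum plus bias
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s > 0 else 0

def perceptron_update(w: list[float], b: float, x: list[float],
                      target: int, lr: float = 0.1) -> tuple[list[float], float]:
    # Classic rule: nudge weights by lr * error * input
    err = target - perceptron_predict(w, b, x)
    return [wi + lr * err * xi for wi, xi in zip(w, x)], b + lr * err

# Learn OR on its four points
w, b = [0.0, 0.0], 0.0
data = [([0.0, 0.0], 0), ([0.0, 1.0], 1), ([1.0, 0.0], 1), ([1.0, 1.0], 1)]
for _ in range(20):  # a few epochs suffice on linearly separable data
    for x, t in data:
        w, b = perceptron_update(w, b, x, t)

print([perceptron_predict(w, b, x) for x, _ in data])  # → [0, 1, 1, 1]
```

The same loop, split across primitives, building blocks, and a lesson script, is roughly what `lessons/01_perceptron.py` walks through.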