This module introduces thread-based coordination with locks and queues in Python.
- Difficulty: Advanced.
- Estimated Time: 45-60 minutes.
- Prerequisites: 01-foundations/control-flow, 03-advanced/structs-and-classes.
- Cross-Language Lens: Compare `std::thread`, `Task`, goroutines, and Python threads as different concurrency building blocks.
- Run: `python example/main.py`
- Starting worker threads.
- Waiting for workers with `join`.
- Protecting shared state with `threading.Lock`.
- Coordinating producer-consumer flow with `queue.Queue`.
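The thread lifecycle above can be sketched in a few lines. This is a minimal illustration (not the module's `example/main.py`): several workers increment a shared counter, a `threading.Lock` guards the read-modify-write, and `join` ensures all workers finish before the result is read.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:          # protect the read-modify-write on counter
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()                # wait for every worker before reading the total

print(counter)  # 40000
```

Without the lock, `counter += 1` can interleave between threads and lose updates, which is exactly the pitfall the next section warns about.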
- Assuming the GIL removes the need for coordination design.
- Updating shared state without a lock.
- Forgetting to join worker threads before reading final results.
- Leaving consumer threads blocked without a completion signal.
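The last pitfall deserves a concrete shape. One common fix, shown here as a sketch, is a sentinel value: without it, the consumer would block forever on `q.get()` after the final item.

```python
import queue
import threading

SENTINEL = None          # completion signal agreed on by both sides
q = queue.Queue()
consumed = []

def consumer():
    while True:
        item = q.get()
        if item is SENTINEL:   # explicit shutdown, not a real work item
            break
        consumed.append(item)

t = threading.Thread(target=consumer)
t.start()
for i in range(3):
    q.put(i)
q.put(SENTINEL)          # tell the consumer it can stop
t.join()
print(consumed)  # [0, 1, 2]
```

`queue.Queue` also offers `task_done()`/`join()` for tracking completion of individual items; the sentinel pattern is the simpler tool for shutting a consumer down.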
- Python can express concurrency clearly, but the runtime model differs from C++ and C# because of the GIL and its effect on CPU-bound work.
- Compared with Go or TypeScript, this track highlights why concurrency (structuring independent work) and parallel speedup (running it simultaneously) are not the same thing.
- The main comparison is orchestration clarity versus actual parallel execution.
- exercises/01.py: parallel chunk sum with worker threads.
- exercises/02.py: producer-consumer queue with `queue.Queue`.
- exercises/01.py
- Input: one line of integers, then worker count.
- Output: per-worker partial sums and final total.
- Edge cases: worker count larger than list size; non-positive worker count; empty list.
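A hypothetical sketch of the core idea behind exercises/01.py (the helper name and chunking strategy are illustrative, not the reference solution): split the list into chunks, sum each chunk in a worker thread, then combine the partials. The clamp handles the listed edge cases.

```python
import threading

def parallel_sum(numbers, workers):
    # Clamp edge cases: non-positive worker count, or more workers than items.
    workers = max(1, min(workers, len(numbers) or 1))
    chunk = -(-len(numbers) // workers)  # ceiling division
    partials = [0] * workers

    def run(i):
        partials[i] = sum(numbers[i * chunk:(i + 1) * chunk])

    threads = [threading.Thread(target=run, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # partials are only complete after every join
    return partials, sum(partials)

print(parallel_sum([1, 2, 3, 4, 5], 2))  # ([6, 9], 15)
```

Each worker writes only its own slot in `partials`, so no lock is needed for the partial sums themselves.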
- exercises/02.py
- Input: number of items to produce.
- Output: produced/consumed item logs and completion summary.
- Edge cases: zero items; consumer waiting on an empty queue.
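A hypothetical sketch of the shape of exercises/02.py (function and log formats are illustrative): a producer puts `n` items on a `queue.Queue`, a consumer drains it, and a sentinel plus `q.join()` mark completion, covering the zero-item edge case.

```python
import queue
import threading

def produce_and_consume(n):
    q = queue.Queue()
    log = []

    def consumer():
        while True:
            item = q.get()
            if item is None:        # completion signal
                q.task_done()
                break
            log.append(f"consumed {item}")
            q.task_done()

    t = threading.Thread(target=consumer)
    t.start()
    for i in range(n):              # n == 0: nothing is produced
        log.append(f"produced {i}")
        q.put(i)
    q.put(None)
    q.join()                        # every item (and the sentinel) processed
    t.join()
    return log

log = produce_and_consume(3)
```

Note that producer and consumer log entries interleave nondeterministically, but the consumed entries always appear in FIFO order.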
- I can start threads and wait for them to finish safely.
- I can protect shared state with a lock.
- I can use a queue to coordinate a producer and consumer.
- I completed exercises/01.py.
- I completed exercises/02.py.