# TaskFlow (Local Task Tracker)

A minimalistic local task tracker implementing a simple workflow:

Backlog → In Progress → Done

The service exposes a clean HTTP API, provides workflow metrics, persists state to disk, and includes a lightweight HTML dashboard.
## Tech Stack

- Python 3.14
- FastAPI 0.135
- Pydantic v2
- Pytest
- Docker (local runtime)
## Features

- CRUD operations for tasks
- Workflow transitions with validation:
  - `backlog → in_progress → done`
- Idempotent `start` operation
- Metrics:
  - Lead Time
  - Cycle Time
- CSV export with injection protection
- Persistent storage (`data/tasks.json`)
- HTML dashboard (`/stats`)
- Atomic file writes with corruption recovery
## Project Structure

```
.
├── app/          # Application code
│   ├── api/      # Transport layer (FastAPI routers)
│   ├── domain/   # Business logic
│   └── storage/  # Persistence layer
├── data/         # Persistent storage (tasks.json)
├── tests/        # Test suite
└── Dockerfile
```
## Architecture

The project follows a layered architecture:

1. **Transport Layer**
   - FastAPI routers
   - Request/response validation (Pydantic)
   - HTTP error mapping
2. **Domain Layer**
   - Business rules
   - Workflow transitions
   - Metrics calculation
3. **Storage Layer**
   - File-based persistence
   - Atomic writes (tmp + rename)
   - Corruption recovery (backup + reset)
## Domain Model

### Task

```json
{
  "id": "uuid",
  "title": "string (max 100)",
  "created_at": "datetime (UTC ISO-8601)",
  "started_at": "datetime | null",
  "done_at": "datetime | null"
}
```
## Status Derivation

Status is not stored explicitly; it is derived from the timestamps:

- `backlog` → `started_at == null`
- `in_progress` → `started_at != null && done_at == null`
- `done` → `done_at != null`
## API Endpoints

### Tasks

#### List tasks

```
GET /api/tasks
```

Query params:

- `limit` (default: 100)
- `status` (optional: `backlog` | `in_progress` | `done`)
- `search` (optional substring match)
#### Get task by ID

```
GET /api/tasks/{id}
```

#### Create task

```
POST /api/tasks
```

#### Update task

```
PATCH /api/tasks/{id}
```

#### Delete task

```
DELETE /api/tasks/{id}
```
### Workflow Actions

#### Start task

```
POST /api/tasks/{id}/start
```

Rules:

- Idempotent (multiple calls do not change `started_at`)
- Valid only if the task is in `backlog`

#### Complete task

```
POST /api/tasks/{id}/done
```

Rules:

- Allowed only from `in_progress`
- Enforces workflow integrity
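A minimal sketch of how the domain layer might enforce these rules. The function names, the plain-dict task representation, and `WorkflowError` are assumptions for illustration, not the project's actual API:

```python
from datetime import datetime, timezone


class WorkflowError(Exception):
    """Raised on an invalid workflow transition (would map to HTTP 409)."""


def start_task(task: dict) -> dict:
    """POST /api/tasks/{id}/start: idempotent, so a repeated call is a no-op."""
    if task["done_at"] is not None:
        raise WorkflowError("invalid_transition")
    if task["started_at"] is None:
        task["started_at"] = datetime.now(timezone.utc)
    return task


def complete_task(task: dict) -> dict:
    """POST /api/tasks/{id}/done: allowed only from in_progress."""
    if task["started_at"] is None or task["done_at"] is not None:
        raise WorkflowError("invalid_transition")
    task["done_at"] = datetime.now(timezone.utc)
    return task
```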
### Export

```
GET /api/tasks/export
```

- Returns `text/csv`
- Includes header row
- Protects against CSV injection (`=`, `+`, `-`, `@`)
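A common way to implement this protection is to prefix any cell that starts with a formula-triggering character with a single quote, so spreadsheet applications treat it as text rather than a formula. A sketch (the helper name is illustrative):

```python
# Characters that can trigger formula evaluation in spreadsheet apps.
CSV_INJECTION_CHARS = ("=", "+", "-", "@")


def sanitize_cell(value: str) -> str:
    """Neutralize CSV injection by prefixing dangerous cells with a quote."""
    if value.startswith(CSV_INJECTION_CHARS):
        return "'" + value
    return value
```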
### Stats

```
GET /stats
```

HTML page:

- Top block: selected task details
  - Title
  - Start datetime
  - Done datetime
  - Cycle time
- Bottom: Kanban board
  - Backlog
  - In Progress
  - Done
## Error Format

All errors follow a unified format:

```json
{
  "error": "invalid_*",
  "message": "human readable description"
}
```

Examples:

- `invalid_id`
- `invalid_payload`
- `invalid_transition`
- `invalid_transaction`

HTTP codes:

- 2xx — success
- 4xx — client errors (validation, transitions)
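A sketch of mapping error codes to responses in this format. Only the 409 for transition errors is fixed by this spec (see Testing Requirements); the other status codes below are assumptions:

```python
# Assumed status mapping; the spec only pins 409 for transition errors.
HTTP_STATUS_BY_ERROR = {
    "invalid_id": 404,
    "invalid_payload": 422,
    "invalid_transition": 409,
    "invalid_transaction": 409,
}


def error_response(code: str, message: str) -> tuple[int, dict]:
    """Map a domain error code to (HTTP status, unified JSON body)."""
    return HTTP_STATUS_BY_ERROR.get(code, 400), {"error": code, "message": message}
```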
## Data Persistence

- File: `data/tasks.json`
- Writes are atomic:
  1. Write to a temp file
  2. Rename
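The temp-file-then-rename pattern can be sketched as follows (`atomic_write_json` is an illustrative name; `os.replace` provides the atomic rename on both POSIX and Windows):

```python
import json
import os
import tempfile


def atomic_write_json(path: str, data: object) -> None:
    """Write JSON atomically: dump to a temp file in the same directory, then rename."""
    dirname = os.path.dirname(path) or "."
    fd, tmp_path = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())  # ensure bytes hit disk before the rename
        os.replace(tmp_path, path)  # atomic: readers see old or new file, never partial
    except BaseException:
        os.unlink(tmp_path)
        raise
```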
### Corruption Handling

If the JSON is invalid:

1. Back up the corrupted file
2. Reset to an empty state
3. The service continues running
## Metrics

- Lead Time = `done_at - created_at`
- Cycle Time = `done_at - started_at`

Average values are computed across completed tasks.
## Testing Requirements

The test suite must cover:

- CRUD operations and filters
- Workflow transitions:
  - valid transitions → 2xx
  - invalid → `409 invalid_transaction`
- Metrics correctness
- CSV export:
  - `text/csv` content type
  - correct header
- Persistence between restarts
- Performance:
  - typical requests ≤ 100 ms
## Running with Docker

### Build and run

```shell
docker build -t taskflow .
docker run -p 8000:8000 -v $(pwd)/data:/app/data taskflow
```

### One-command run (recommended)

```shell
docker compose up --build
```
## Conventions

- All timestamps are UTC (ISO-8601)
- Validation via Pydantic v2
- Strict API contracts
- No implicit state mutations
- Idempotent operations where required
## Goal

The project is considered complete when:

- The service runs in Docker with a single command
- All tests pass
- State persists across restarts
- The API behaves according to the specification