A memory benchmarking and analysis tool for CPython development, designed to track memory usage patterns across different commits and build configurations.
This project consists of three main components:
- Backend (`/backend/`) - FastAPI application with SQLite database for data storage and API endpoints
- Frontend (`/frontend/`) - Next.js React application with rich data visualization and analysis tools
- Worker (`/worker/`) - Python CLI tool for running memory benchmarks on CPython commits
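The backend's role as the storage layer can be pictured with a toy schema. This is a sketch only: the real schema lives in `/backend/`, and the table and column names here are assumptions.

```python
import sqlite3

# Illustrative schema only; the actual tables are defined in /backend/.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE benchmark_results (
           commit_sha  TEXT NOT NULL,
           benchmark   TEXT NOT NULL,
           peak_rss_kb INTEGER NOT NULL
       )"""
)
conn.execute(
    "INSERT INTO benchmark_results VALUES (?, ?, ?)",
    ("abc1234", "json_loads", 51200),
)
row = conn.execute(
    "SELECT peak_rss_kb FROM benchmark_results WHERE commit_sha = ?",
    ("abc1234",),
).fetchone()
# row == (51200,)
```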
- Docker Engine 20.10+ and Docker Compose 2.0+
- CPython source repository (for benchmarking with the worker)
```shell
# Copy environment config
cp .env.example .env

# Build and start all services
docker compose -f docker-compose.dev.yml up --build
```

Services start automatically with hot reload:
- Frontend: http://localhost:9002
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/api/docs
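When scripting against the dev stack, it can help to wait for a service port to come up before issuing requests. A small stdlib-only helper (not part of this project, just a convenience sketch):

```python
import socket
import time

def wait_for_service(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# e.g. wait_for_service("localhost", 8000) before calling the backend API
```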
```shell
# Via Docker (recommended)
docker compose -f docker-compose.dev.yml exec frontend npm run lint
docker compose -f docker-compose.dev.yml exec frontend npm run typecheck

# Or locally in the frontend directory
npm run lint       # ESLint (must pass with zero errors)
npm run typecheck  # TypeScript type checking
```

Both checks run in CI on pushes to main and on pull requests.
```shell
docker compose -f docker-compose.dev.yml exec backend python scripts/populate_db.py
```

Edit `backend/requirements.in`, then regenerate both lockfiles:
```shell
docker run --rm -v "$(pwd)/backend:/app" -w /app python:3.13-slim-bookworm \
  sh -c "pip install --quiet pip-tools && \
         pip-compile --strip-extras --generate-hashes \
           --output-file requirements.txt requirements.in && \
         pip-compile --strip-extras --generate-hashes \
           --output-file requirements-dev.txt requirements-dev.in"
```
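After regenerating, a quick sanity check can confirm that every top-level requirement was pinned in the lockfile. This sketch uses deliberately simplified parsing (it assumes one bare package name per line in `requirements.in`):

```python
def unpinned_packages(req_in: str, req_txt: str) -> list[str]:
    """Names listed in requirements.in but missing from the compiled lockfile."""
    wanted = {
        line.strip().lower()
        for line in req_in.splitlines()
        if line.strip() and not line.lstrip().startswith("#")
    }
    pinned = {
        line.split("==")[0].strip().lower()
        for line in req_txt.splitlines()
        if "==" in line
    }
    return sorted(wanted - pinned)

missing = unpinned_packages("fastapi\nuvicorn\n", "fastapi==0.111.0\n")
# missing == ["uvicorn"]
```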
```shell
# Rebuild the backend container:
docker compose -f docker-compose.dev.yml up --build -d backend
```

```shell
# Set authentication token
export MEMORY_TRACKER_TOKEN=your_token_here
```
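The worker authenticates its API calls with this token. As a sketch of what such a request might look like (the `/api/upload` path, the Bearer scheme, and the payload fields are assumptions for illustration, not the project's actual API):

```python
import json
import urllib.request

def build_upload_request(token: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request; endpoint and fields are hypothetical."""
    return urllib.request.Request(
        "http://localhost:8000/api/upload",      # assumed endpoint path
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_upload_request("your_token_here", {"commit": "abc1234", "peak_rss_kb": 51200})
```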
```shell
# List available binaries and environments
memory-tracker list-binaries
memory-tracker list-environments

# Run benchmarks on CPython commits
memory-tracker benchmark /path/to/cpython HEAD~5..HEAD \
  --binary-id default \
  --environment-id linux-x86_64
```
```shell
# Parallel processing with 4 workers
memory-tracker benchmark /path/to/cpython HEAD~10..HEAD \
  --binary-id default \
  --environment-id linux-x86_64 \
  --max-workers 4
```
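Conceptually, `--max-workers` fans the commit range out to a pool of workers. A rough stdlib sketch of that idea (the real scheduling lives in `/worker/`; `run_benchmark` here is a stand-in):

```python
from concurrent.futures import ThreadPoolExecutor

def run_benchmark(commit: str) -> tuple[str, int]:
    """Stand-in for a real per-commit benchmark run (returns a fake peak RSS in kB)."""
    return commit, 51200

commits = [f"commit-{i}" for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:  # mirrors --max-workers 4
    results = dict(pool.map(run_benchmark, commits))
# one result per commit, regardless of worker count
```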
```shell
# Local checkout mode (sequential only)
memory-tracker benchmark /path/to/cpython HEAD~5..HEAD \
  --binary-id default \
  --environment-id linux-x86_64 \
  --local-checkout
```

```shell
# Development with hot reload
docker compose -f docker-compose.dev.yml up
```
```shell
# Production deployment
docker compose up
```

Running services directly on the host is possible but not recommended. Docker Compose ensures consistent Python/Node versions, database setup, and dependency isolation across all platforms.
- Python 3.13+
- Node.js 20+
```shell
make setup        # Install deps, init DB, populate mock data
make dev          # Start frontend + backend with hot reload
make test         # Run backend tests
make reset-db     # Drop and recreate database with fresh data
make populate-db  # Populate the DB with mock data
make build        # Build frontend for production
make clean        # Clean up generated files and caches
```

- Navigate to `/trends` to view memory usage over time
- Filter by specific benchmarks or commit ranges
- Compare different Python build configurations
- Export charts for reports and presentations
- Go to `/diff` for commit comparison
- Select two commits to analyze
- View detailed memory usage differences
- Identify performance regressions or improvements
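At its core, the diff view reduces to a per-benchmark comparison between two commits, along the lines of:

```python
def memory_delta(base_kb: float, head_kb: float) -> tuple[float, float]:
    """Absolute and percent memory change between a base and a head commit (illustrative)."""
    delta = head_kb - base_kb
    return delta, 100.0 * delta / base_kb

delta, pct = memory_delta(51200.0, 53760.0)
# delta == 2560.0, pct == 5.0 -> a likely regression
```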
```shell
# Benchmark recent commits with parallel processing
memory-tracker benchmark ~/cpython HEAD~20..HEAD \
  --binary-id optimized \
  --environment-id linux-x86_64 \
  --max-workers 8 \
  --batch-size 4
```
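One plausible reading of `--batch-size` is that commits are grouped before being handed to workers; check `/worker/` for the actual behavior. The grouping itself would be simple:

```python
def batch(commits: list[str], size: int) -> list[list[str]]:
    """Split a commit list into consecutive groups of at most `size` commits."""
    return [commits[i:i + size] for i in range(0, len(commits), size)]

groups = batch([f"c{i}" for i in range(10)], 4)
# group sizes: 4, 4, 2
```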
```shell
# Force overwrite existing results
memory-tracker benchmark ~/cpython HEAD~10..HEAD \
  --binary-id default \
  --environment-id linux-x86_64 \
  --force
```

- Follow the existing code style and conventions
- Run tests before committing
- Use TypeScript for all frontend code
- Follow the repository patterns for new features
- Never commit secrets or authentication tokens
This project is licensed under the MIT License - see the LICENSE file for details.
- CPython - The Python programming language
- Memray - Memory profiler for Python
- pyperformance - Python performance benchmarking suite