After setting up dozens of Python projects, I've landed on a structure that works for everything from quick scripts to production services. Here's what I use and why.
Flat Layout vs Src Layout
There are two common ways to organize Python packages:
Flat layout:
myproject/
├── myproject/
│ └── __init__.py
├── tests/
└── pyproject.toml
Src layout:
myproject/
├── src/
│ └── myproject/
│ └── __init__.py
├── tests/
└── pyproject.toml
I prefer src layout for one critical reason: it prevents import shadowing bugs.
In a flat layout, running Python from the project root puts your local package directory on sys.path, so imports can resolve to the working tree instead of the installed package. Your tests pass locally but fail in production. The src layout forces you to install the package properly, catching this early.
Use flat layout for simple scripts. Use src layout for anything you'll distribute or deploy.
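One way to see whether you've hit the shadowing problem is to check where Python actually loaded a module from. A minimal sketch, using the stdlib json module as a stand-in for your own package name:

```python
# Inspect which copy of a module Python imported. If __file__ points
# into your working tree instead of site-packages, your tests are
# exercising uninstalled code. "json" stands in for your package here.
import importlib

module = importlib.import_module("json")
print(module.__file__)  # path of the copy that actually got imported
```

With a src layout, an uninstalled package simply fails to import, which surfaces the mistake immediately instead of silently shadowing.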
pyproject.toml: One File to Rule Them All
Modern Python puts everything in pyproject.toml. No more juggling setup.py, setup.cfg, requirements.txt, and tool configs:
[project]
name = "myproject"
version = "0.1.0"
description = "A useful project"
requires-python = ">=3.11"
dependencies = [
"httpx>=0.27",
"pydantic>=2.0",
]
[project.optional-dependencies]
dev = [
"pytest>=8.0",
"ruff>=0.5",
"mypy>=1.10",
]
[project.scripts]
myproject = "myproject.cli:main"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/myproject"]
[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]
[tool.ruff]
line-length = 100
target-version = "py311"
[tool.mypy]
python_version = "3.11"
strict = true
Everything in one place. Tool configuration, dependencies, build settings—all searchable, all version controlled.
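For completeness, here's a minimal sketch of the callable that the [project.scripts] entry (myproject = "myproject.cli:main") resolves to; the version string is illustrative:

```python
# src/myproject/cli.py -- the function the console script points at.
# After `pip install -e .`, running `myproject` in a shell calls main().
def main() -> int:
    print("myproject 0.1.0")
    return 0  # exit code reported back to the shell
```

The return value becomes the process exit code, so returning non-zero lets shell scripts detect failures.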
Dependency Management: uv vs Poetry vs pip-tools
This is where opinions diverge. Here's my take after using all three:
uv (My Current Choice)
uv is the new Rust-based tool from the Astral team (creators of Ruff). It's fast—10-100x faster than pip.
# Create venv and install
uv venv
uv pip install -e ".[dev]"
# Or use uv's project management
uv sync  # reads pyproject.toml, creates lockfile
Pros: Blazing fast, drop-in pip replacement, generates lockfiles, actively developed.
Cons: Newer tool, still evolving.
Poetry
Poetry was one of the first "modern" Python package managers. It bundles virtual env management, dependency resolution, and publishing:
poetry init
poetry add httpx
poetry install
Pros: Great UX, handles everything, strong ecosystem.
Cons: Uses its own poetry.lock format, can be slow on large projects, adds complexity.
pip-tools
pip-tools is minimal: you write requirements.in, it generates requirements.txt:
# requirements.in
httpx>=0.27
pydantic>=2.0
# Generate locked requirements
pip-compile requirements.in
pip install -r requirements.txt
Pros: Simple, works with standard pip, battle-tested.
Cons: Manual workflow, doesn't manage virtualenvs.
My Recommendation
For new projects: Use uv. It's fast, standards-compliant, and doesn't lock you into a proprietary format.
For existing Poetry projects: Keep using Poetry. Migration isn't worth it unless you're hitting performance issues.
For simple scripts: pip with pyproject.toml is enough.
Directory Structure by Project Type
CLI Application
mycli/
├── src/mycli/
│ ├── __init__.py
│ ├── __main__.py # python -m mycli
│ ├── cli.py # Click/Typer commands
│ └── core.py # Business logic
├── tests/
├── pyproject.toml
└── README.md
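The cli.py/core.py split above keeps business logic testable without invoking the CLI. A sketch of both halves in one file, using stdlib argparse (the article's layout suggests Click/Typer; slugify is a hypothetical example function):

```python
import argparse


# core.py: pure business logic, trivially unit-testable
def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())


# cli.py: thin shell that parses arguments and delegates to core
def main(argv=None) -> int:
    parser = argparse.ArgumentParser(prog="mycli")
    parser.add_argument("title", help="title to slugify")
    args = parser.parse_args(argv)
    print(slugify(args.title))
    return 0
```

Accepting argv as a parameter (instead of reading sys.argv implicitly) makes main() itself testable: main(["Some Title"]) runs the whole pipeline without a subprocess.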
Library/Package
mylib/
├── src/mylib/
│ ├── __init__.py # Public API exports
│ ├── core.py
│ ├── utils.py
│ └── exceptions.py
├── tests/
├── docs/
├── pyproject.toml
└── README.md
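The exceptions.py module in this layout typically defines a small hierarchy so callers can catch one base class. A sketch with illustrative names (MyLibError, ConfigError, NotFoundError are not from the article):

```python
# src/mylib/exceptions.py -- one base class for the whole library,
# so callers can write `except MyLibError` and catch everything.
class MyLibError(Exception):
    """Base class for all errors raised by mylib."""


class ConfigError(MyLibError):
    """Raised when configuration is invalid."""


class NotFoundError(MyLibError):
    """Raised when a requested resource does not exist."""
```

The __init__.py would then re-export the public names (for example, from mylib.exceptions import MyLibError) so users never import submodules directly.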
Web Service (FastAPI/Flask)
myservice/
├── src/myservice/
│ ├── __init__.py
│ ├── main.py # App factory
│ ├── api/
│ │ ├── __init__.py
│ │ └── routes.py
│ ├── core/
│ │ ├── __init__.py
│ │ └── config.py
│ └── models/
│ ├── __init__.py
│ └── schemas.py
├── tests/
├── alembic/ # Migrations
├── pyproject.toml
└── Dockerfile
Monorepo with Multiple Packages
monorepo/
├── packages/
│ ├── core/
│ │ ├── src/core/
│ │ ├── tests/
│ │ └── pyproject.toml
│ └── cli/
│ ├── src/cli/
│ ├── tests/
│ └── pyproject.toml
├── pyproject.toml # Workspace config (uv/Poetry)
└── README.md
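The root pyproject.toml wires the packages together. A sketch of what the workspace config might look like with uv (check the uv documentation for the current syntax; Poetry uses its own mechanism):

```toml
# Root pyproject.toml -- declares every packages/* directory as a
# workspace member sharing one lockfile and one virtualenv.
[tool.uv.workspace]
members = ["packages/*"]
```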
Testing Setup
Tests live in a tests/ directory at the project root, mirroring the source structure:
tests/
├── __init__.py
├── conftest.py # Shared fixtures
├── test_cli.py
└── test_core.py
conftest.py
This is pytest's fixture hub. Put shared fixtures here:
# tests/conftest.py
import pytest
from myproject.core import Database
@pytest.fixture
def db():
"""Provide a test database connection."""
database = Database(":memory:")
database.setup()
yield database
database.teardown()
@pytest.fixture
def sample_user():
"""Sample user data for tests."""
return {
"name": "Owen",
"email": "owen@example.com",
}
Fixtures from conftest.py are automatically available to all tests in that directory and its subdirectories—no import needed.
Running Tests
# Basic run
pytest tests/ -v
# With coverage
pytest tests/ --cov=src/myproject --cov-report=term-missing
# Specific test
pytest tests/test_core.py::test_process_data -v
Configure pytest in pyproject.toml:
[tool.pytest.ini_options]
testpaths = ["tests"]
pythonpath = ["src"]
addopts = "-v --tb=short"
My Complete Template
Here's what I start every project with:
myproject/
├── src/
│ └── myproject/
│ ├── __init__.py
│ ├── __main__.py
│ ├── cli.py
│ └── core.py
├── tests/
│ ├── __init__.py
│ ├── conftest.py
│ └── test_core.py
├── pyproject.toml
├── Makefile
├── README.md
└── .gitignore
With a simple Makefile:
.PHONY: setup test lint check
setup:
uv venv
uv pip install -e ".[dev]"
test:
pytest tests/ -v
lint:
ruff check src/ tests/
ruff format --check src/ tests/
check: lint test
Type make setup && make check and you're done.
Start Simple, Scale When Needed
Don't over-engineer from day one. A flat structure with three files is fine for a script. Add the src layout when you're distributing. Add subdirectories when you have 10+ modules.
The best project structure is one you'll actually maintain.