Awesome-omni-skill pytest
Python testing with pytest framework. This skill should be used when user asks to write tests, create test files, run tests, fix failing tests, add test coverage, set up fixtures, mock dependencies, or do any testing-related work for Python code. Triggers on requests like "write tests for this", "add unit tests", "test this function", "fix the failing test", "run pytest", "set up test fixtures", or "mock this dependency".
```bash
git clone https://github.com/diegosouzapw/awesome-omni-skill

T=$(mktemp -d) && git clone --depth=1 https://github.com/diegosouzapw/awesome-omni-skill "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/development/pytest" ~/.claude/skills/diegosouzapw-awesome-omni-skill-pytest-d9e944 && rm -rf "$T"
```
skills/development/pytest/SKILL.md

Pytest Testing Skill
Write and run Python tests using pytest with Test-Driven Development (TDD).
Before Implementation
Gather context to ensure successful test implementation:
| Source | Gather |
|---|---|
| Codebase | Existing test structure, conftest.py patterns, fixtures in use |
| Conversation | What code to test, expected behavior, edge cases |
| Skill References | Testing patterns from directory |
| User Guidelines | Project testing conventions, coverage requirements |
Clarifications
Required (ask if not clear)
- Test type? Unit tests / Integration tests / End-to-end tests
- Framework being tested? FastAPI / Django / CLI / Pure Python
- Async code? Yes (need pytest-asyncio) / No
Optional (ask if relevant)
- Coverage target? 80% / 90% / 100% / No requirement
- Mocking needed? External APIs / Database / File system
Official Documentation
| Resource | URL | Use For |
|---|---|---|
| Pytest Docs | https://docs.pytest.org | Official reference |
| pytest-asyncio | https://pytest-asyncio.readthedocs.io | Async testing |
| pytest-cov | https://pytest-cov.readthedocs.io | Coverage reports |
| pytest-mock | https://pytest-mock.readthedocs.io | Mocking utilities |
| HTTPX Testing | https://www.python-httpx.org/async/ | FastAPI async testing |
Version Note: This skill follows pytest 7.x+ patterns. For older versions, check migration guides.
TDD Workflow (Red-Green-Refactor)
ALWAYS follow TDD when writing code:
The Cycle
- 🔴 RED → Write a failing test first
- 🟢 GREEN → Write minimal code to pass the test
- 🔄 REFACTOR → Clean up code, keep tests green
TDD Rules
- Never write code without a failing test first
- Write only enough code to make the test pass
- Refactor only when tests are green
Quick Example
```python
# Step 1: 🔴 RED - Write failing test
def test_add():
    assert add(2, 3) == 5  # NameError: name 'add' is not defined

# Step 2: 🟢 GREEN - Minimal implementation
def add(a, b):
    return a + b  # Test passes!

# Step 3: 🔄 REFACTOR - Improve if needed (tests stay green)
```
TDD for FastAPI Endpoints
```python
# Step 1: 🔴 RED - Test first (endpoint doesn't exist)
def test_get_user(client):
    response = client.get("/users/1")
    assert response.status_code == 200
    assert response.json()["id"] == 1

# Step 2: 🟢 GREEN - Create endpoint
@app.get("/users/{user_id}")
def get_user(user_id: int):
    return {"id": user_id, "name": "Test User"}

# Step 3: 🔄 REFACTOR - Add proper DB lookup, error handling
```
See references/tdd.md for the complete TDD workflow and patterns.
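The client fixture used in the test above is not defined in the snippet; it would normally live in conftest.py. A minimal sketch, assuming the FastAPI app object is importable from app.main (adjust the path to the actual project):

```python
# tests/conftest.py - a sketch; the app.main import path is an assumption
import pytest
from fastapi.testclient import TestClient

from app.main import app  # adjust to wherever the FastAPI app is defined


@pytest.fixture
def client():
    # TestClient drives the app in-process, so no server needs to be running
    return TestClient(app)
```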
Quick Start
Write a Basic Test
```python
def test_function_name():
    result = function_under_test(input)
    assert result == expected
```
Run Tests
```bash
uv run pytest                     # Run all tests
uv run pytest tests/test_file.py  # Run specific file
uv run pytest -k "test_name"      # Run matching tests
uv run pytest -x                  # Stop on first failure
uv run pytest --lf                # Run last failed only
uv run pytest -v                  # Verbose output
```
Test File Structure
Place tests in a tests/ directory mirroring the source structure:
```text
project/
├── src/
│   ├── api.py
│   └── utils.py
└── tests/
    ├── conftest.py   # Shared fixtures
    ├── test_api.py
    └── test_utils.py
```
Test file names must start with test_ or end with _test.py.
Writing Tests
Test Functions
```python
import pytest


def test_addition():
    assert add(2, 3) == 5


def test_raises_error():
    with pytest.raises(ValueError):
        validate("")
```
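When the exception type alone is too coarse, pytest.raises can also assert on the error message via its match argument (a regular expression). A short sketch, reusing the hypothetical validate function from above; the expected message text is illustrative:

```python
import pytest


def test_raises_error_with_message():
    # match is searched as a regex against str() of the raised exception
    with pytest.raises(ValueError, match="empty"):
        validate("")
```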
Test Classes
```python
class TestCalculator:
    def test_add(self):
        calc = Calculator()
        assert calc.add(2, 3) == 5

    def test_divide_by_zero(self):
        calc = Calculator()
        with pytest.raises(ZeroDivisionError):
            calc.divide(1, 0)
```
Fixtures
```python
@pytest.fixture
def client():
    return TestClient(app)


def test_endpoint(client):
    response = client.get("/api/users")
    assert response.status_code == 200
```
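Fixtures that need teardown can yield instead of return: everything before the yield is setup, everything after runs once the test finishes. A minimal sketch using pytest's built-in tmp_path fixture; the file name and contents are illustrative:

```python
import pytest


@pytest.fixture
def temp_config(tmp_path):
    config_file = tmp_path / "config.json"
    config_file.write_text('{"debug": true}')  # setup
    yield config_file                           # the test runs at this point
    config_file.unlink()                        # teardown (tmp_path is cleaned up by pytest anyway)


def test_reads_config(temp_config):
    assert temp_config.read_text() == '{"debug": true}'
```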
Parametrized Tests
```python
@pytest.mark.parametrize("input,expected", [
    ("hello", 5),
    ("", 0),
    ("world", 5),
])
def test_length(input, expected):
    assert len(input) == expected
```
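When parametrized cases need readable names in the test report, pytest.param lets each case carry an explicit id. A small sketch extending the example above; the case values and ids are illustrative:

```python
import pytest


@pytest.mark.parametrize(
    "value,expected",
    [
        pytest.param("hello", 5, id="ascii"),
        pytest.param("", 0, id="empty"),
        pytest.param("héllo", 5, id="accented"),
    ],
)
def test_length_with_ids(value, expected):
    assert len(value) == expected
```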
Mocking
```python
from unittest.mock import patch, Mock


@patch("module.external_api")
def test_with_mock(mock_api):
    mock_api.return_value = {"status": "ok"}
    result = fetch_data()
    assert result["status"] == "ok"
    mock_api.assert_called_once()
```
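If the pytest-mock plugin is installed, the same test can use its mocker fixture instead of the decorator. A sketch assuming the same hypothetical module.external_api target and fetch_data function as above:

```python
def test_with_mocker(mocker):
    # mocker.patch is undone automatically when the test ends
    mock_api = mocker.patch("module.external_api", return_value={"status": "ok"})
    result = fetch_data()
    assert result["status"] == "ok"
    mock_api.assert_called_once()
```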
FastAPI Async Testing (httpx)
```python
import pytest
from httpx import ASGITransport, AsyncClient

from app.main import app


@pytest.mark.anyio
async def test_async_endpoint():
    async with AsyncClient(
        transport=ASGITransport(app=app), base_url="http://test"
    ) as ac:
        response = await ac.get("/")
        assert response.status_code == 200
```
See references/fastapi_testing.md for dependency overrides, auth testing, and WebSockets.
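The @pytest.mark.anyio marker above comes from the anyio pytest plugin. A common companion, assuming that plugin is installed, is a conftest.py fixture that pins the async backend so the test runs on asyncio only; a sketch:

```python
import pytest


@pytest.fixture
def anyio_backend():
    # Run anyio-marked tests on asyncio (trio would be the other option)
    return "asyncio"
```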
Common Markers
```python
@pytest.mark.skip(reason="Not implemented")
@pytest.mark.skipif(sys.version_info < (3, 10), reason="Requires 3.10+")
@pytest.mark.xfail(reason="Known bug")
@pytest.mark.asyncio  # For async tests (requires pytest-asyncio)
@pytest.mark.slow     # Custom marker for slow tests
```
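Custom markers such as slow should be registered so pytest does not emit PytestUnknownMarkWarning. One way is the pytest_configure hook in conftest.py; the description text here is illustrative:

```python
# conftest.py
def pytest_configure(config):
    config.addinivalue_line(
        "markers", "slow: marks tests as slow (deselect with -m 'not slow')"
    )
```

Slow tests can then be deselected with uv run pytest -m "not slow".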
Coverage
```bash
uv run pytest --cov=src --cov-report=term-missing
uv run pytest --cov=src --cov-report=html
```
Resources
Scripts
- scripts/run_tests.py: Run pytest with useful defaults
- scripts/generate_tests.py: Generate test file stubs from source
- scripts/check_coverage.py: Check coverage thresholds and report files below target
References
- references/tdd.md: TDD workflow, Red-Green-Refactor cycle, TDD patterns for APIs
- references/patterns.md: Fixture patterns, mocking, async testing, database testing
- references/conftest_templates.md: Ready-to-use conftest.py templates for web apps, FastAPI, Django, CLI testing
- references/fastapi_testing.md: FastAPI-specific testing (dependency overrides, async testing with httpx, auth, WebSockets, background tasks)
- references/plugins.md: Popular pytest plugins with usage examples
- references/troubleshooting.md: Common errors and how to fix them
- references/ci_cd.md: GitHub Actions and GitLab CI integration
Popular Plugins
| Plugin | Purpose | Install |
|---|---|---|
| pytest-cov | Code coverage | uv add --dev pytest-cov |
| pytest-asyncio | Async test support | uv add --dev pytest-asyncio |
| pytest-anyio | Alternative async support | uv add --dev anyio |
| pytest-mock | Easier mocking | uv add --dev pytest-mock |
| pytest-xdist | Parallel testing | uv add --dev pytest-xdist |
| pytest-httpx | Mock httpx requests (example below) | uv add --dev pytest-httpx |
| pytest-env | Environment variables | uv add --dev pytest-env |
| pytest-timeout | Test timeouts | uv add --dev pytest-timeout |
See references/plugins.md for detailed usage.
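As an illustration of the plugin style, pytest-httpx from the table above injects an httpx_mock fixture for stubbing outbound httpx calls. A sketch, assuming pytest-httpx is installed; the URL and payload are illustrative:

```python
import httpx


def test_external_status(httpx_mock):
    httpx_mock.add_response(url="https://api.example.com/status", json={"status": "ok"})
    response = httpx.get("https://api.example.com/status")
    assert response.json() == {"status": "ok"}
```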
CI/CD Quick Start
GitHub Actions
```yaml
- name: Install uv
  uses: astral-sh/setup-uv@v4
- name: Run tests
  run: uv run pytest --cov=src --cov-report=xml
```
GitLab CI
```yaml
test:
  script:
    - pip install uv
    - uv run pytest --cov=src
```
See references/ci_cd.md for complete examples.
Troubleshooting
| Error | Solution |
|---|---|
| ModuleNotFoundError | Add or fix PYTHONPATH (see the sketch below) |
| fixture not found | Check conftest.py location |
| async def functions are not natively supported | Add pytest-asyncio and mark the test |
| collected 0 items | Check test file/function naming |
See references/troubleshooting.md for detailed solutions.
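For import errors such as ModuleNotFoundError, one common fix (besides installing the project as an editable package) is a small path shim at the top of tests/conftest.py. A sketch, assuming the src/ layout shown earlier:

```python
# tests/conftest.py
import sys
from pathlib import Path

# Make src/ importable when the project is not installed as a package
sys.path.insert(0, str(Path(__file__).resolve().parent.parent / "src"))
```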
Common Mistakes
| Mistake | Why It's Wrong | Fix |
|---|---|---|
| Testing implementation, not behavior | Brittle tests that break on refactor | Test inputs → outputs (see the sketch below) |
| Missing async marker | Async test silently passes | Add marker for async tests |
| Hardcoded test data | Tests fail in different environments | Use fixtures and factories |
| Not using conftest.py | Duplicate fixtures across files | Centralize shared fixtures |
| Ignoring test isolation | Tests affect each other | Use fresh fixtures per test |
| Mocking too much | Tests don't catch real bugs | Mock only external dependencies |
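To make the first row concrete, a sketch contrasting an implementation-coupled test with a behavior test; pricing, apply_discount, and _round_to_cents are hypothetical names, and the first test assumes pytest-mock:

```python
import pricing  # hypothetical module under test


def test_discount_brittle(mocker):
    # Brittle: asserts that a private helper was called, so any refactor breaks it
    spy = mocker.spy(pricing, "_round_to_cents")
    pricing.apply_discount(100.0, 0.1)
    spy.assert_called_once()


def test_discount_behavior():
    # Robust: asserts only on the observable input -> output behavior
    assert pricing.apply_discount(100.0, 0.1) == 90.0
```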
Before Delivery Checklist
Test Quality
- Tests follow TDD (written before implementation)
- Each test tests one thing (single assertion focus)
- Tests are independent (can run in any order)
- Edge cases covered (empty, null, boundaries)
Coverage
- Coverage meets project requirements
- Critical paths have 100% coverage
- Run: uv run pytest --cov --cov-report=term-missing
Organization
- Tests mirror source structure
- Shared fixtures in conftest.py
- Descriptive test names (test_<action>_<scenario>_<expected>)
CI Ready
- All tests pass: uv run pytest
- No hardcoded paths or credentials
- Async tests properly marked