Building a Python Library in 2026

2026-03-28

So you want to build a Python library in 2026? Here's everything you need to know about the state of the art.

In 2025, I built a Python library for internal use at my company. We still use and maintain this library frequently. Modern tooling made this job way easier than it would have been 5 years ago. This blog post captures the lessons I learned.

I will do my best to focus on principles over dogmas. When relevant, I will be clear about the tools I have experience with vs the tools I haven't used.

Initialize Your Project

Python's tooling ecosystem has churned more than almost any other language's. Decades of shifting consensus around build systems, type checking, linting, formatting, and publishing have only recently coalesced into uv from Astral. uv is beautiful in its ability to simplify everything about package development and maintenance. For that reason, uv will be the backbone of our modern Python library.

Install uv, and let's begin.

uv init --lib my-package

Folder Structure

By default, your package will have this folder structure:

.
├── .git
│   └── ...
├── .gitignore
├── pyproject.toml
├── .python-version
├── README.md
└── src
    └── my_package
        ├── __init__.py
        └── py.typed

pyproject.toml

pyproject.toml holds your project-related metadata. It conforms to the Python Packaging User Guide's pyproject.toml specification.

Review the [project] section for completeness. This section is used by PyPI to render your project listing. Refer to the Writing your pyproject.toml guide for help with important fields like project.description and project.authors. I've found it useful to let an LLM fill out the project.keywords and project.classifiers fields.

Become familiar with the SemVer standard for version numbering. SemVer version numbers have a major, minor, and patch number, separated by periods. Use uv to set the version number appropriately—don't manage it by hand.

uv version --bump {major,minor,patch}

src/my_package

Your library code will go in the src/my_package directory.

src/my_package/__init__.py marks the directory as an importable package. It is typical to use this file to expose top-level imports through use of the __all__ magic variable. For example, the openai package uses its top-level __init__.py to expose the OpenAI client, as well as a selection of commonly used types, exceptions, and constants. Don't worry about getting this exactly right from the beginning. Your linter will guide you to best practices.

Linting and Formatting

You need a linter. Your brain is too precious to waste on style and structure decisions. Let a linter take care of polishing the code.

You also need a formatter. Just like a linter, a formatter takes care of meaningless decisions, letting you work on the important things. Formatters also play an important role in git-based development. Without formatters, it's possible for trivial style diffs to trigger your CI pipelines.

I use Astral's ruff for linting and formatting. It's popular, but it's not the only option. It's not uncommon for older projects to use Flake8 (linting) and Black (formatting). Whichever tools you decide to use, add them to your project as dev dependencies.
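ruff is configured in pyproject.toml alongside the rest of your project metadata. A minimal sketch, where the specific rule selection is my illustration rather than a recommendation:

```toml
# pyproject.toml — illustrative ruff configuration.
[tool.ruff]
line-length = 88

[tool.ruff.lint]
# "E" (pycodestyle) and "F" (Pyflakes) are defaults; "I" adds
# import sorting, "ANN" enforces type annotations.
select = ["E", "F", "I", "ANN"]
```

Start small and add rules as the linter's suggestions prove useful.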

uv add --dev ruff

Once added, you can run your tools with uv run.

uv run ruff format
uv run ruff check --fix

You will probably be running these commands a lot. Document these somewhere. I like using a command runner (like make or just). They're simple to use and self-documenting. But you can also get by with a folder of shell scripts or even just a good README.

# Makefile
format:
    uv run ruff format

lint:
    uv run ruff check --fix

Type Checking

No library built in 2026 can call itself "quality" without type annotations. In fact, your linter will (hopefully) complain if you try to skimp out on type signatures. Type signatures focus your work, sniff out bugs, and enable IDE superpowers.

You need some tool to check your annotations for bugs or errors. mypy is the go-to here, though Astral's ty and Meta's pyrefly are both strong contenders. VS Code's Python support is built on pyright. These tools differ slightly in their type-checking behavior, so it's worth trying them all out to see which one you prefer. Fortunately, uv makes it simple to test them all.

# Test out all three type checkers.
uv run --with mypy mypy src/
uv run --with ty ty check
uv run --with pyrefly pyrefly check

# Add the one you like
uv add --dev mypy
# Makefile
lint:
    uv run ruff check
    uv run mypy src/

If you're uncomfortable working with type annotations, I recommend anthonywritescode's typing-puzzles repository and YouTube series.

Testing

You need tests. Tests typically go into a top-level tests/ folder, often separated into subfolders like tests/unit and tests/integration.

There are a number of testing frameworks you can choose from. pytest is a sensible default, though you can get pretty far with the built-in unittest. I haven't tried other frameworks.
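A minimal pytest example, assuming a hypothetical slugify helper (defined inline here so the snippet is self-contained; in practice you would import it from my_package):

```python
# tests/unit/test_slugify.py — pytest discovers test_* functions
# automatically; plain asserts are all you need.

def slugify(title: str) -> str:
    """Hypothetical helper: lowercase and join words with hyphens."""
    return "-".join(title.lower().split())


def test_slugify_collapses_whitespace() -> None:
    assert slugify("Hello  World") == "hello-world"


def test_slugify_is_idempotent() -> None:
    assert slugify(slugify("Hello World")) == "hello-world"
```

Run it with `uv run pytest tests/unit`.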

Whatever you use, try to incorporate code coverage into your tests. Code coverage gets a bad rap, but code coverage requirements have helped me catch more bugs than I care to commit.

uv add --dev pytest pytest-cov

Save your test commands with the rest of your development scripts. You will likely have multiple commands for different levels of testing.

# Makefile
# Unit tests. These run quickly, so 
# run them often.
test:
    uv run pytest --cov=my_package \
    --cov-report=term-missing tests/unit

# Integration tests. Run before each
# release.
test-integration:
    uv run pytest \
     tests/unit tests/integration

Supporting multiple versions

I test my package against as many Python versions as it supports. A lot of performance-critical packages rely on C extensions, which can occasionally break with new releases of Python. The traditional solutions to this challenge are tox and nox. Fortunately, uv makes it trivial to test libraries against any Python version, without extra dependencies. Just use the --python flag.

# Makefile
test-versions:
  uv python find 3.12 \
    || uv python install 3.12
  uv run --python 3.12 pytest \
    tests/unit
  uv python find 3.13 \
    || uv python install 3.13
  uv run --python 3.13 pytest \
    tests/unit

Maintaining Code Quality

CI

Without a way to strictly enforce linting, formatting, type checking, and testing, your code quality will slowly deteriorate. This is especially true if you develop your library collaboratively. Use CI/CD to enforce code quality at release time.

I use GitHub Actions, though writing safe GitHub Actions is non-trivial, and GitHub Actions has been the target of several devastating supply-chain attacks. The latest trend has been to migrate to Codeberg with Forgejo Actions. I haven't used it myself.

At the very least, your actions should run your check scripts before deploying a new package version. Here is a super simple example:

#!/usr/bin/env bash

uv sync

# Current release's version
CURR_VER=$(uv version --short)

# Get prior commit's package version
git restore -s HEAD~ -W pyproject.toml
PRIOR_VER=$(uv version --short)
git restore .

if [ "$CURR_VER" = "$PRIOR_VER" ]; then
  # Package version number didn't
  # change; no action needed.
  exit 0
fi

# Check code quality
uv run ruff check
uv run ruff format --check
uv run mypy src/
uv run pytest tests/

# Build and publish
uv build
uv publish

Be careful about running GitHub Actions from the marketplace. It is typically much safer to vendor your actions into the repository.
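For completeness, here is a sketch of a workflow that runs a script like the one above. The script path is an assumption, and the third-party actions shown (actions/checkout, astral-sh/setup-uv) are exactly the kind of marketplace dependency you should pin to a full commit SHA or vendor:

```yaml
# .github/workflows/release.yml — illustrative only. Pin the `uses:`
# lines to commit SHAs in a real setup.
name: release
on:
  push:
    branches: [main]
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 2  # the script diffs against HEAD~
      - uses: astral-sh/setup-uv@v5
      - run: bash scripts/release.sh
```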

Pre-commit

CI scripts are great for enforcing code quality, but they can really suck from a DX perspective. People don't want to submit PRs, wait several minutes, then get a build failure because they forgot to run the formatter. Rather, help developers keep their code clean as they develop with git hooks.

Creating git hooks can be as easy as adding uv run ruff check to a shell script at .git/hooks/pre-commit. However, since hooks don't get saved to the repository, hooks prepared manually in this way wind up being difficult to reproduce and distribute.

The community's solution to this problem is the pre-commit package. pre-commit enables source-tracked git hooks with simple installation scripts. The community has already set up a number of pre-commit scripts to help with Python development. For example, you can use astral-sh/ruff-pre-commit to run the ruff linter and formatter before every commit.
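A sketch of what the resulting configuration looks like with the ruff hooks; the rev shown is illustrative, so pin whichever release is current when you set this up:

```yaml
# .pre-commit-config.yaml — run ruff's linter and formatter
# before every commit.
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.8.0  # illustrative; use the latest tag
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
```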

# Set up pre-commit
uv add --dev pre-commit
uv run pre-commit sample-config \
  > .pre-commit-config.yaml
# ... add your hooks to 
# .pre-commit-config.yaml ...

# Install the git hooks
uv run pre-commit install

You should also have the pre-commit installation script documented with the rest of the development scripts.

# Makefile
setup:
    uv sync
    uv run pre-commit install

Publishing your Package

More than anything, uv is a build tool. You can't do better than uv's own documentation for building and publishing your packages. If you are publishing to PyPI or an internal package repository, follow their guide. However, with uv, it's not strictly necessary to publish your packages to a registry!

My company is small. Our development team only has three active Python developers. It didn't make sense to spin up a custom Python package registry just for us three. However, for compliance reasons, we couldn't publish directly to PyPI, either. Instead, we install our libraries from source. All of us already have the library's git repository on our computers. To use the library in a personal project, we simply run uv add /path/to/my-package. uv handles building, caching, and linking the library. All we needed was a cronjob to keep the repository up to date.

cd ~/my-personal-project/
uv add ~/my-package
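After running that command, uv records the dependency as a local path source in the consuming project's pyproject.toml. A sketch of the result, with illustrative paths:

```toml
# pyproject.toml of my-personal-project (paths are illustrative).
[project]
dependencies = ["my-package"]

[tool.uv.sources]
my-package = { path = "/home/me/my-package" }
```

Because the source is a path, `uv sync` rebuilds against whatever the local checkout currently contains.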

The Final Product

After following the steps above, your repository should look something like this:

  • .git/hooks/pre-commit - pre-commit hooks for maintaining code quality; perhaps configured via .pre-commit-config.yaml
  • src/my_package/ - your library, with helpful imports and quality documentation
  • tests/ - unit, integration, and other tests, run via pytest or a similar test runner
  • pyproject.toml - useful, current metadata about your package
  • {Makefile,justfile,scripts/} - script runner, for documenting the most important commands
  • {.github,.forgejo}/workflows - CI validation scripts; many alternatives exist

Learning from Others

Let's see how real-world companies with high-impact projects use these tools in their libraries.

openai/openai-python

  • Linting: ruff
  • Formatting: black
  • Commands: a scripts/ directory of shell scripts
  • Build: hatchling, via rye
  • Testing: pytest, with nox for testing across versions
  • Type Checking: both pyright and mypy
  • CI: GitHub Actions

The OpenAI Python SDK was created in 2020, so we can almost forgive them for using the deprecated predecessor to uv. Hopefully, they will upgrade to uv soon, given their acquisition of Astral.

This project uses both pyright and mypy for type-checking—a great example of how to handle different engines' subtle variations in behavior.

pola-rs/polars

  • Linting: ruff
  • Formatting: ruff
  • Commands: Makefile
  • Build: setuptools
  • Testing: pytest with coverage
  • Type Checking: mypy
  • CI: GitHub Actions

The polars package is unusual in that it is a Rust package with Python bindings. This constrains the build system. Despite this, uv is perfectly adequate for their needs.

This list is short. Are there any other projects with notable stacks that I should include? Shoot me an email or respond to this thread on Hacker News.

Stephen Funk