GitHub - ThatXliner/xclif: Xliner's Command Line Interface Framework


Xliner's CLI Framework

Read the Manifesto to understand why Xclif exists and how it compares to Click, Typer, and argparse.

Installation

```shell
pip install xclif
```

Or with uv:

```shell
uv add xclif
```

Quick Start

Your directory structure is your command tree:

```
myapp/
├── __init__.py
├── __main__.py
└── routes/
    ├── __init__.py       →  myapp
    ├── greet.py          →  myapp greet
    └── server/
        ├── __init__.py   →  myapp server
        ├── start.py      →  myapp server start
        └── stop.py       →  myapp server stop
```

```python
# routes/greet.py
from xclif import command

@command()
def _(name: str, template: str = "Hello, {}!") -> None:
    """Greet someone by name."""
    print(template.format(name))
```

```python
# __main__.py
from xclif import Cli
from . import routes

cli = Cli.from_routes(routes)
if __name__ == "__main__":
    cli()
```

No default → positional argument. Has default → `--template` option. Docstring → help text. Drop a file in the right folder and the command exists.
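The inference rule above can be sketched with plain `inspect` — this mirrors the documented behavior, not Xclif's actual internals:

```python
import inspect

def greet(name: str, template: str = "Hello, {}!") -> None:
    """Greet someone by name."""
    print(template.format(name))

# Parameters without a default become positional arguments;
# parameters with a default become --options.
for param in inspect.signature(greet).parameters.values():
    if param.default is inspect.Parameter.empty:
        print(f"{param.name}: positional argument")
    else:
        print(f"{param.name}: --{param.name} option (default {param.default!r})")
```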

Features

  • File-based routing — directory structure is the command tree
  • Decorator + type-hint API — function signatures define the CLI contract
  • Rich integration — beautiful help pages, formatted errors, progress indicators
  • Built-in logging and verbosity — `--verbose` / `-v` wired up automatically
  • Config management — `WithConfig[T]` reads from config files or environment variables (CLI flag > env var > config file > default)
  • Agent-optimized help — auto-detects non-TTY output and emits a compact, token-efficient format for LLM agents and scripts
  • Autogenerated shell completions — bash, zsh, fish
  • Minimal overhead — custom parser built from scratch for fast startup; xclif compile pre-builds a static manifest to eliminate route-walking cost
  • Plugin discovery (planned) — third-party subcommands via entry points (like Git or cargo)
  • Easy testing — `command.execute(["greet", "Alice"])` with explicit arg lists, no mocking needed
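The config precedence listed above can be sketched as a plain resolver — a hypothetical helper for illustration, not Xclif's actual `WithConfig[T]` implementation:

```python
import os

def resolve_setting(cli_value, env_var, config_file, key, default):
    """Resolve a setting: CLI flag > env var > config file > default."""
    if cli_value is not None:          # explicit CLI flag always wins
        return cli_value
    if env_var in os.environ:          # then the environment
        return os.environ[env_var]
    if key in config_file:             # then the config file
        return config_file[key]
    return default                     # finally the declared default

os.environ["MYAPP_PORT"] = "9000"
print(resolve_setting(None, "MYAPP_PORT", {"port": "8080"}, "port", "3000"))
# -> 9000: the env var beats the config file and the default
```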

Performance

Performance is not a primary focus of Xclif. If startup latency is a hard constraint, Python is probably the wrong tool for the job. These numbers are here for fun. That being said, we do recommend switching to a manifest-based setup for large codebases.

Startup time

Benchmarked on macOS (Apple Silicon, Python 3.12, 30 iterations + 3 warmup, wall-clock subprocess time):

| Scenario | Click (ms) | Typer (ms) | Xclif from_routes (ms) | Xclif flat (ms) | Xclif manifest (ms) |
|---|---:|---:|---:|---:|---:|
| `greet World` | 28.0 | 39.8 | 41.0 | 27.2 | 26.9 |
| `greet` + options | 27.9 | 37.8 | 40.9 | 26.3 | 26.7 |
| `config set` | 27.6 | 38.0 | 40.5 | 26.1 | 26.9 |
| `config get` | 27.9 | 38.1 | 40.3 | 26.2 | 26.8 |
| `--help` | 29.2 | 82.2 | 59.6 | 46.6 | 47.2 |
| `greet --help` | 29.7 | 83.8 | 59.3 | 46.8 | 47.6 |

Xclif (flat) uses the decorator API (`Command.command()` / `Command.group()`) instead of `from_routes`, and is the fastest configuration tested for command execution, edging out Click by ~1–2 ms. The `--help` gap (~20 ms vs Click) is Rich's lazy-import cost. For large codebases, the manifest compiler pre-builds a static manifest that matches flat API performance without requiring manual command registration.

Typer is the slowest overall: it wraps Click with extra overhead, and its Rich-based help rendering adds ~52–53 ms on `--help` scenarios.

`from_routes` adds ~13–15 ms for the package walker on top of Xclif (flat) — the trade-off for zero-registration file-based routing.

`from_manifest` eliminates that cost. Running `xclif compile myapp.routes` once generates a `_xclif_manifest.py` next to your routes package; loading it with `Cli.from_manifest()` skips the filesystem walk entirely, matching flat API performance.

```python
# __main__.py — manifest variant
from xclif import Cli
from myapp import _xclif_manifest

cli = Cli.from_manifest(_xclif_manifest)
if __name__ == "__main__":
    cli()
```

```shell
# regenerate after adding or removing routes
xclif compile myapp.routes
```

To reproduce (requires hyperfine; `brew install hyperfine` on macOS):

```shell
bash benchmarks/bench_frameworks.sh
```

Parse and dispatch latency

Measured in-process (no subprocess, no import cost) on the same machine, 5,000 iterations:

| Scenario | Click | Typer | Xclif |
|---|---:|---:|---:|
| `greet World` | 72 µs | 496 µs | 2.7 µs |
| `greet` + options | 77 µs | 490 µs | 4.0 µs |
| `config set` | 82 µs | 578 µs | 3.4 µs |
| `config get` | 79 µs | 553 µs | 3.2 µs |
| `--help` | 131 µs | 1,879 µs | 483 µs |
| `greet --help` | 142 µs | 2,281 µs | 553 µs |

Xclif's custom parser is ~25× faster than Click and ~170× faster than Typer for command dispatch. The --help cases are slower than Click because Rich is doing real formatting work; Typer is dramatically slower there due to its reflective help generation.

To reproduce:

```shell
uv run python benchmarks/bench_parsing.py
```
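In-process timing of this kind follows a standard `timeit` pattern. The sketch below is generic (the `dispatch` body is a placeholder, not Xclif's API or the actual benchmark script):

```python
import timeit

def dispatch() -> None:
    # Placeholder for the parse-and-dispatch call under test,
    # e.g. the framework's equivalent of executing ["greet", "World"].
    sum(range(10))

ITERATIONS = 5_000
total = timeit.timeit(dispatch, number=ITERATIONS)
print(f"{total / ITERATIONS * 1e6:.2f} µs per dispatch")
```

Because the subprocess and import costs are excluded, this isolates pure parse-and-dispatch latency, which is why the numbers are microseconds rather than the milliseconds in the startup table.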

Contributing

See CONTRIBUTING.md.

License

This project is licensed under the MIT license.