A command-line tool that answers yes/no questions using an LLM. It returns its answer as an exit code: 0 for true, 1 for false, 2 for error.
```
$ yeah does 2 + 2 equal 4?
$ echo $?
0
$ yeah does 2 + 2 equal 5?
$ echo $?
1
$ yeah is there a file called go.mod in this directory?
$ echo $?
0
```
No output. No prompts. Just an exit code.
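Because the answer is carried entirely in the exit status, a script can branch on all three outcomes. A minimal sketch (the question here is illustrative):

```shell
# Ask a question; the answer arrives only as the exit status.
yeah is the current directory a git repository?
case $? in
  0) echo "yes" ;;    # the model answered yes
  1) echo "no" ;;     # the model answered no
  *) echo "error" ;;  # 2 means something went wrong
esac
```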
## Install
```
go install github.com/crawshaw/yeah@latest
```
## Configuration
All configuration is via environment variables:
| Variable | Description | Default |
|---|---|---|
| `YEAH_PROVIDER` | LLM provider: `anthropic` or `openai` | `anthropic` |
| `YEAH_MODEL` | Model name | `claude-sonnet-4-5-20250929` (anthropic) / `gpt-4.1` (openai) |
| `YEAH_GATEWAY` | LLM gateway URL | exe.dev gateway |
| `ANTHROPIC_API_KEY` | Anthropic API key | not needed with gateway |
| `OPENAI_API_KEY` | OpenAI API key | not needed with gateway |
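For example, to route requests through OpenAI with your own key rather than the default gateway (the key value below is a placeholder):

```shell
# Point yeah at OpenAI directly, using the variables from the table above.
export YEAH_PROVIDER=openai
export YEAH_MODEL=gpt-4.1
export OPENAI_API_KEY=your-key-here   # placeholder; substitute a real key
yeah is this machine connected to the internet?
```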
## Examples
Use in shell scripts:
```
if yeah is the disk more than 90% full?; then
    echo "Disk space critical!"
fi

if yeah does main.go contain a function called main?; then
    echo "Found main function"
fi

if yeah is git installed on this system?; then
    git status
fi
```
## Safety
Like all LLM-powered tools, it is as safe as your LLM.
Which is to say, it is not safe. The warranty disclaimer in the LICENSE is doing a lot of work here.
To make it a little less awful, though, on macOS yeah runs under sandbox-exec with a deny-file-write policy, and on Linux it uses Landlock for similar protection.