DNS Benchmark Tool
Part of BuildTools - Network Performance Suite
Fast, comprehensive DNS performance testing with DNSSEC validation, DoH/DoT support, and enterprise features
pip install dns-benchmark-tool
dns-benchmark benchmark --use-defaults --formats csv,excel
1,400+ downloads this week! Thank you to our growing community.
Want multi-region testing? Join the waitlist →
Today's Release Highlights
We've added three powerful CLI commands to make DNS benchmarking even more versatile:
- top – quick ranking of resolvers by speed and reliability
- compare – side-by-side benchmarking with detailed statistics and export options
- monitoring – continuous performance tracking with alerts and logging
# Quick resolver ranking
dns-benchmark top

# Compare resolvers side-by-side
dns-benchmark compare Cloudflare Google Quad9 --show-details

# Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --formats csv,excel --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
Community Highlights
- Stars: Grew from 7 to 110+ after posting on Hacker News
- Downloads: Rebounded to 200+/day after initially stalling
- Mastodon: Shared there too, but the real surge came from HN
- Feedback: Constructive input from the HN community directly shaped patches v0.3.0 to v0.3.1
- Takeaway: Hacker News visibility was the catalyst for adoption momentum
Table of Contents
- DNS Benchmark Tool
- Part of BuildTools - Network Performance Suite
- Today's Release Highlights
- Community Highlights
- Table of Contents
- Why This Tool?
- Quick start
- Key Features
- Advanced Capabilities
- Use Cases
- Installation & Setup
- Usage Examples
- Inline input support for resolvers and domains
- Utilities
- Complete usage guide
- README Adjustments for Final Patch
- CLI Commands
- Analysis Enhancements
- Best Practices
- Feedback & Community Input
- Configuration Files
- Output formats
- Performance optimization
- Troubleshooting
- Automation & CI
- Screenshots
- Getting help
- Release workflow
- Hosted Version (Coming Soon)
- Roadmap
- Contributing
- FAQ
- Links & Support
- License
Why This Tool?
DNS resolution is often the hidden bottleneck in network performance. A slow resolver can add hundreds of milliseconds to every request.
The Problem
- Hidden Bottleneck: DNS can add 300ms+ to every request
- Unknown Performance: Most developers never test their DNS
- Location Matters: The "fastest" resolver depends on where YOU are
- Security Varies: DNSSEC, DoH, DoT support differs wildly
The Solution
dns-benchmark-tool helps you:
- Find the fastest DNS resolver for YOUR location
- Get real data - P95, P99, jitter, consistency scores
- Validate security - DNSSEC verification built-in
- Test at scale - 100+ concurrent queries in seconds
Perfect For
- Developers optimizing API performance
- DevOps/SRE validating resolver SLAs
- Self-hosters comparing Pi-hole/Unbound vs public DNS
- Network admins running compliance checks
Quick start
Installation
pip install dns-benchmark-tool
Run Your First Benchmark
# Test default resolvers against popular domains
dns-benchmark benchmark --use-defaults --formats csv,excel
View Results
Results are automatically saved to ./benchmark_results/ with:
- Summary CSV with statistics
- Detailed raw data
- Optional PDF/Excel reports
That's it! You just benchmarked 5 DNS resolvers against 10 domains.
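To see what a run produced, list the output directory (exact file names vary by run and by the formats you selected):

# Newest results first
ls -lt ./benchmark_results/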
Key Features
Performance
- Async queries - Test 100+ resolvers simultaneously
- Multi-iteration - Run benchmarks multiple times for accuracy
- Statistical analysis - Mean, median, P95, P99, jitter, consistency
- Cache control - Test with/without DNS caching
Security & Privacy
- DNSSEC validation - Verify cryptographic trust chains
- DNS-over-HTTPS (DoH) - Encrypted DNS benchmarking
- DNS-over-TLS (DoT) - Secure transport testing
- DNS-over-QUIC (DoQ) - Experimental QUIC support
Analysis & Export
- Multiple formats - CSV, Excel, PDF, JSON
- Visual reports - Charts and graphs
- Domain statistics - Per-domain performance analysis
- Error breakdown - Identify problematic resolvers
Enterprise Features
- TSIG authentication - Secure enterprise queries
- Zone transfers - AXFR/IXFR validation
- Dynamic updates - Test DNS write operations
- Compliance reports - Audit-ready documentation
Cross-Platform
- Linux, macOS, Windows - Works everywhere
- CI/CD friendly - JSON output, exit codes
- IDNA support - Internationalized domain names
- Auto-detection - Windows WMI DNS discovery
Advanced Capabilities
Note: These flags are documented for visibility but not yet implemented.
They represent upcoming advanced features.
- --doh – DNS-over-HTTPS benchmarking (coming soon)
- --dot – DNS-over-TLS benchmarking (coming soon)
- --doq – DNS-over-QUIC benchmarking (coming soon)
- --dnssec-validate – DNSSEC trust chain validation (coming soon)
- --zone-transfer – AXFR/IXFR zone transfer testing (coming soon)
- --tsig – TSIG-authenticated queries (coming soon)
- --idna – Internationalized domain name support (coming soon)
Performance & Concurrency Features
- Async I/O with dnspython - Test 100+ resolvers simultaneously
- Trio framework support - High-concurrency async operations
- Configurable concurrency - Control max concurrent queries
- Retry logic - Exponential backoff for failed queries
- Cache simulation - Test with/without DNS caching
- Multi-iteration benchmarks - Run tests multiple times for accuracy
- Warmup phase - Pre-warm DNS caches before testing
- Statistical analysis - Mean, median, P95, P99, jitter, consistency scores
Example:
dns-benchmark benchmark \
  --max-concurrent 200 \
  --iterations 5 \
  --timeout 3.0 \
  --warmup
Security & Privacy Features
- DNSSEC validation - Verify cryptographic trust chains
- DNS-over-HTTPS (DoH) - Encrypted DNS benchmarking via HTTPS
- DNS-over-TLS (DoT) - Secure transport layer testing
- DNS-over-QUIC (DoQ) - Experimental QUIC protocol support
- TSIG authentication - Transaction signatures for enterprise DNS
- EDNS0 support - Extended DNS features and larger payloads
Example:
# Test DoH resolvers
dns-benchmark benchmark \
--doh \
--resolvers doh-providers.json \
--dnssec-validate
Enterprise & Migration Features
- Zone transfers (AXFR/IXFR) - Full and incremental zone transfer validation
- Dynamic DNS updates - Test DNS write operations and updates
- EDNS0 support - Extended DNS options, client subnet, larger payloads
- Windows WMI integration - Auto-detect active system DNS settings
- Compliance reporting - Generate audit-ready PDF/Excel reports
- SLA validation - Track uptime and performance thresholds
Example:
# Validate DNS migration (--zone-transfer is coming soon)
dns-benchmark benchmark \
  --resolvers old-provider.json,new-provider.json \
  --zone-transfer \
  --output migration-report/ \
  --formats pdf,excel
Analysis & Reporting Features
- Per-domain statistics - Analyze performance by domain
- Per-record-type stats - Compare A, AAAA, MX, TXT, etc.
- Error breakdown - Categorize and count error types
- Comparison matrices - Side-by-side resolver comparisons
- Trend analysis - Performance over time (with multiple runs)
- Best-by-criteria - Find best resolver by latency/reliability/consistency
Example:
# Detailed analysis
dns-benchmark benchmark \
--use-defaults \
--domain-stats \
--record-type-stats \
--error-breakdown \
--formats csv,excel,pdf
Internationalization & Compatibility
- IDNA support - Internationalized domain names (IDN)
- Multiple record types - A, AAAA, MX, TXT, CNAME, NS, SOA, PTR, SRV, CAA
- Cross-platform - Linux, macOS, Windows (native support)
- CI/CD integration - JSON output, proper exit codes, quiet mode
- Custom resolvers - Load from JSON, test your own DNS servers
- Custom domains - Test against your specific domain list
Example:
# Test internationalized domains
dns-benchmark benchmark \
--domains international-domains.txt \
--record-types A,AAAA,MX \
--resolvers custom-resolvers.json
Most users only need basic features. These advanced capabilities are available when you need them.
Use Cases
For Developers: Optimize API Performance
# Find fastest DNS for your API endpoints
dns-benchmark benchmark \
--domains api.myapp.com,cdn.myapp.com \
--record-types A,AAAA \
--resolvers production.json \
--iterations 10
Result: Reduce API latency by 100-300ms
For DevOps/SRE: Validate Before Migration
# Test new DNS provider before switching (--dnssec-validate is coming soon)
dns-benchmark benchmark \
  --resolvers current-dns.json,new-dns.json \
  --use-defaults \
  --dnssec-validate \
  --output migration-report/ \
  --formats csv,excel
Result: Verify performance and security before migration
For Self-Hosters: Prove Pi-hole Performance
# Compare Pi-hole against public resolvers (coming soon)
dns-benchmark compare \
--resolvers pihole.local,1.1.1.1,8.8.8.8,9.9.9.9 \
--domains common-sites.txt \
--rounds 10
Result: Data-driven proof your self-hosted DNS is faster (or not!)
For Network Admins: Automated Health Checks
# Add to crontab for monthly reports (crontab entries must stay on one line)
0 0 1 * * dns-benchmark benchmark --use-defaults --output /var/reports/dns/ --formats excel,csv --domain-stats --error-breakdown
Result: Automated compliance and SLA reporting
For Privacy Advocates: Test Encrypted DNS
# Benchmark privacy-focused DoH/DoT resolvers (--doh is coming soon)
dns-benchmark benchmark \
  --doh \
  --resolvers privacy-resolvers.json \
  --domains sensitive-sites.txt \
  --dnssec-validate
Result: Find fastest encrypted DNS without sacrificing privacy
Installation & Setup
Requirements
- Python 3.9+
- pip package manager
Install from PyPI
pip install dns-benchmark-tool
Install from Source
git clone https://github.com/frankovo/dns-benchmark-tool.git
cd dns-benchmark-tool
pip install -e .
Verify Installation
dns-benchmark --version
dns-benchmark --help
First Run
# Test with defaults (recommended for first time)
dns-benchmark benchmark --use-defaults --formats csv,excel
Usage Examples
Basic Usage
# Basic test with progress bars
dns-benchmark benchmark --use-defaults --formats csv,excel

# Basic test without progress bars
dns-benchmark benchmark --use-defaults --formats csv,excel --quiet

# Test with custom resolvers and domains
dns-benchmark benchmark --resolvers data/resolvers.json --domains data/domains.txt

# Quick test with only CSV output
dns-benchmark benchmark --use-defaults --formats csv
Advanced Usage
# Export a machine-readable bundle
dns-benchmark benchmark --use-defaults --json --output ./results

# Test specific record types
dns-benchmark benchmark --use-defaults --formats csv,excel --record-types A,AAAA,MX

# Custom output location and formats
dns-benchmark benchmark \
  --use-defaults \
  --output ./my-results \
  --formats csv,excel

# Include detailed statistics
dns-benchmark benchmark \
  --use-defaults \
  --formats csv,excel \
  --record-type-stats \
  --error-breakdown

# High concurrency with retries
dns-benchmark benchmark \
  --use-defaults \
  --formats csv,excel \
  --max-concurrent 200 \
  --timeout 3.0 \
  --retries 3

# Website migration planning
dns-benchmark benchmark \
  --resolvers data/global_resolvers.json \
  --domains data/migration_domains.txt \
  --formats excel,pdf \
  --output ./migration_analysis

# DNS provider selection
dns-benchmark benchmark \
  --resolvers data/provider_candidates.json \
  --domains data/business_domains.txt \
  --formats csv,excel \
  --output ./provider_selection

# Network troubleshooting
dns-benchmark benchmark \
  --resolvers "192.168.1.1,1.1.1.1,8.8.8.8" \
  --domains "problematic-domain.com,working-domain.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting

# Security assessment
dns-benchmark benchmark \
  --resolvers data/security_resolvers.json \
  --domains data/security_test_domains.txt \
  --formats pdf \
  --output ./security_assessment

# Performance monitoring
dns-benchmark benchmark \
  --use-defaults \
  --formats csv \
  --quiet \
  --output /var/log/dns_benchmark/$(date +%Y%m%d_%H%M%S)

# New top commands

# Run a basic benchmark (default: rank by latency)
dns-benchmark top
# -> Tests all resolvers with sample domains, ranks by latency

# Limit the number of resolvers shown
dns-benchmark top --limit 5
# -> Shows only the top 5 resolvers

# Rank by success rate
dns-benchmark top --metric success
# -> Ranks resolvers by highest success rate

# Rank by reliability (combined score: success rate + latency)
dns-benchmark top --metric reliability
# -> Uses a weighted score to rank resolvers

# Filter resolvers by category
dns-benchmark top --category privacy
dns-benchmark top --category family
dns-benchmark top --category security
# -> Tests only resolvers in the specified category

# Use a custom domain list
dns-benchmark top --domains domains.txt
# -> Loads domains from a text file instead of the built-in sample list

# Specify DNS record types
dns-benchmark top --record-types A,AAAA,MX
# -> Queries multiple record types (comma-separated)

# Adjust timeout and concurrency
dns-benchmark top --timeout 3.0 --max-concurrent 50
# -> Sets query timeout to 3 seconds and limits concurrency to 50

# Export results to JSON
dns-benchmark top --output results.json
# -> Saves results in JSON format

# Export results to CSV
dns-benchmark top --output results.csv
# -> Saves results in CSV format

# Export results to TXT
dns-benchmark top --output results.txt
# -> Saves results in plain text format

# Quiet mode (no progress bar, CI/CD friendly)
dns-benchmark top --quiet
# -> Suppresses progress output

# Example combined usage
dns-benchmark top --limit 10 --metric reliability --category privacy --output top_resolvers.csv
# -> Benchmarks privacy resolvers, ranks by reliability, shows top 10, exports to CSV

# New compare commands

# Comparison of resolvers by name
dns-benchmark compare Cloudflare Google Quad9
# ^ Compares Cloudflare, Google, and Quad9 resolvers using default domains and record type A

# Basic compare of resolvers by IP address
dns-benchmark compare 1.1.1.1 8.8.8.8 9.9.9.9
# ^ Directly specify resolver IPs instead of names

# Increase iterations for more stable results
dns-benchmark compare "Cloudflare" "Google" --iterations 5
# ^ Runs 5 rounds of queries per resolver/domain/record type

# Use a custom domain list from file
dns-benchmark compare Cloudflare Google -d ./data/domains.txt
# ^ Loads domains from domains.txt instead of sample domains

# Query multiple record types
dns-benchmark compare Cloudflare Google -t A,AAAA,MX
# ^ Tests A, AAAA, and MX records for each domain

# Adjust timeout and concurrency
dns-benchmark compare Cloudflare Google --timeout 3.0 --max-concurrent 200
# ^ Sets query timeout to 3 seconds and allows 200 concurrent queries

# Export results to JSON
dns-benchmark compare Cloudflare Google -o results.json
# ^ Saves comparison summary to results.json

# Export results to CSV
dns-benchmark compare Cloudflare Google -o results.csv
# ^ Saves comparison summary to results.csv (via CSVExporter)

# Suppress progress output
dns-benchmark compare Cloudflare Google --quiet
# ^ Runs silently, only prints final results

# Show detailed per-domain breakdown
dns-benchmark compare Cloudflare Google --show-details
# ^ Prints average latency and success counts per domain for each resolver

# New monitoring commands

# Start monitoring with default resolvers and sample domains
dns-benchmark monitoring --use-defaults
# ^ Runs indefinitely, checking every 60s, using built-in resolvers and 5 sample domains

# Monitor with a custom resolver list from JSON
dns-benchmark monitoring -r resolvers.json --use-defaults
# ^ Loads resolvers from resolvers.json, domains from defaults

# Monitor with a custom domain list
dns-benchmark monitoring -d domains.txt --use-defaults
# ^ Uses default resolvers, but domains are loaded from domains.txt

# Change monitoring interval to 30 seconds
dns-benchmark monitoring --use-defaults --interval 30
# ^ Runs checks every 30 seconds instead of 60

# Run monitoring for a fixed duration (e.g., 1 hour = 3600 seconds)
dns-benchmark monitoring --use-defaults --duration 3600
# ^ Stops automatically after 1 hour

# Set stricter alert thresholds
dns-benchmark monitoring --use-defaults --alert-latency 150 --alert-failure-rate 5
# ^ Alerts if latency >150ms or failure rate >5%

# Save monitoring results to a log file
dns-benchmark monitoring --use-defaults --output monitor.log
# ^ Appends results and alerts to monitor.log

# Combine options: custom resolvers, domains, interval, duration, and logging
dns-benchmark monitoring -r resolvers.json -d domains.txt -i 45 --duration 1800 -o monitor.log
# ^ Monitors resolvers from resolvers.json against domains.txt every 45s, for 30 minutes, logging to monitor.log

# Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
Inline input support for resolvers and domains
This patch introduces full support for comma-separated inline values for the
--resolvers and --domains flags, fixing issue #39 and improving CLI usability without breaking any existing workflows.
New capabilities
- Inline resolvers: --resolvers "1.1.1.1,8.8.8.8,9.9.9.9"
- Inline domains: --domains "google.com,github.com"
- Single values: --resolvers "1.1.1.1" or --domains "google.com"
- Named resolvers: --resolvers "cloudflare,google,quad9"
- Mixed input: --resolvers "1.1.1.1,cloudflare,8.8.8.8"
Backward compatibility
- All existing file-based configurations continue to work
- No breaking changes to the CLI
- File detection takes priority over inline parsing
Usage Examples
Before (Only files worked)
dns-benchmark benchmark \
--resolvers data/resolvers.json \
--domains data/domains.txt
After (Both work)
# Inline (new)
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8,9.9.9.9" \
  --domains "google.com,github.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting

# Files (still works)
dns-benchmark benchmark \
  --resolvers data/resolvers.json \
  --domains data/domains.txt \
  --formats csv
Named resolvers
# Named resolvers
dns-benchmark benchmark \
  --resolvers "Cloudflare,Google,Quad9" \
  --domains "google.com,github.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting_named
Mixed input
# Mixed input
dns-benchmark benchmark \
  --resolvers "1.1.1.1,Cloudflare,8.8.8.8" \
  --domains "google.com,github.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting_mixed
Single
# Single values
dns-benchmark benchmark \
  --resolvers "1.1.1.1" \
  --domains "google.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting
Utilities
Feedback
# Provide feedback
dns-benchmark feedback
Resolver management
# Show default resolvers and domains
dns-benchmark list-defaults

# Browse all available resolvers
dns-benchmark list-resolvers

# Browse with detailed information
dns-benchmark list-resolvers --details

# Filter by category
dns-benchmark list-resolvers --category security
dns-benchmark list-resolvers --category privacy
dns-benchmark list-resolvers --category family

# Export resolvers to different formats
dns-benchmark list-resolvers --format csv
dns-benchmark list-resolvers --format json
Domain management
# List all test domains
dns-benchmark list-domains

# Show domains by category
dns-benchmark list-domains --category tech
dns-benchmark list-domains --category ecommerce
dns-benchmark list-domains --category social

# Limit results
dns-benchmark list-domains --count 10
dns-benchmark list-domains --category news --count 5

# Export domain list
dns-benchmark list-domains --format csv
dns-benchmark list-domains --format json
Category overview
# View all available categories
dns-benchmark list-categories
Configuration management
# Generate sample configuration
dns-benchmark generate-config --output sample_config.yaml

# Category-specific configurations
dns-benchmark generate-config --category security --output security_test.yaml
dns-benchmark generate-config --category family --output family_protection.yaml
dns-benchmark generate-config --category performance --output performance_test.yaml

# Custom configuration for specific use case
dns-benchmark generate-config --category privacy --output privacy_audit.yaml
Complete usage guide
Quick performance test
# Basic test with progress bars
dns-benchmark benchmark --use-defaults

# Quick test with only CSV output
dns-benchmark benchmark --use-defaults --formats csv --quiet

# Test specific record types
dns-benchmark benchmark --use-defaults --record-types A,AAAA,MX
Add-on analytics flags:
# Include domain and record-type analytics and error breakdown
dns-benchmark benchmark --use-defaults \
--domain-stats --record-type-stats --error-breakdown
JSON export:
# Export a machine-readable bundle
dns-benchmark benchmark --use-defaults --json --output ./results
Network administrator
# Compare internal vs external DNS
dns-benchmark benchmark \
  --resolvers "192.168.1.1,1.1.1.1,8.8.8.8,9.9.9.9" \
  --domains "internal.company.com,google.com,github.com,api.service.com" \
  --formats excel,pdf \
  --timeout 3 \
  --max-concurrent 50 \
  --output ./network_audit

# Test DNS failover scenarios
dns-benchmark benchmark \
  --resolvers data/primary_resolvers.json \
  --domains data/business_critical_domains.txt \
  --record-types A,AAAA \
  --retries 3 \
  --formats csv,excel \
  --output ./failover_test
ISP & network operator
# Comprehensive ISP resolver comparison
dns-benchmark benchmark \
  --resolvers data/isp_resolvers.json \
  --domains data/popular_domains.txt \
  --timeout 5 \
  --max-concurrent 100 \
  --formats csv,excel,pdf \
  --output ./isp_performance_analysis

# Regional performance testing
dns-benchmark benchmark \
  --resolvers data/regional_resolvers.json \
  --domains data/regional_domains.txt \
  --formats excel \
  --quiet \
  --output ./regional_analysis
Developer & DevOps
# Test application dependencies
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8" \
  --domains "api.github.com,registry.npmjs.org,pypi.org,docker.io,aws.amazon.com" \
  --formats csv \
  --quiet \
  --output ./app_dependencies

# CI/CD integration test
dns-benchmark benchmark \
  --resolvers data/ci_resolvers.json \
  --domains data/ci_domains.txt \
  --timeout 2 \
  --formats csv \
  --quiet
Security auditor
# Security-focused resolver testing
dns-benchmark benchmark \
  --resolvers data/security_resolvers.json \
  --domains data/malware_test_domains.txt \
  --formats csv,pdf \
  --output ./security_audit

# Privacy-focused testing
dns-benchmark benchmark \
  --resolvers data/privacy_resolvers.json \
  --domains data/tracking_domains.txt \
  --formats excel \
  --output ./privacy_analysis
Enterprise IT
# Corporate network assessment
dns-benchmark benchmark \
  --resolvers data/enterprise_resolvers.json \
  --domains data/corporate_domains.txt \
  --record-types A,AAAA,MX,TXT,SRV \
  --timeout 10 \
  --max-concurrent 25 \
  --retries 2 \
  --formats csv,excel,pdf \
  --output ./enterprise_dns_audit

# Multi-location testing
dns-benchmark benchmark \
  --resolvers data/global_resolvers.json \
  --domains data/international_domains.txt \
  --formats excel \
  --output ./global_performance
README Adjustments for Final Patch
New CLI Options
| Option | Description | Example |
|---|---|---|
| --iterations, -i | Run the full benchmark loop N times | dns-benchmark benchmark --use-defaults -i 3 |
| --use-cache | Allow cached results to be reused across iterations | dns-benchmark benchmark --use-defaults -i 3 --use-cache |
| --warmup | Run a full warmup (all resolvers × domains × record types) | dns-benchmark benchmark --use-defaults --warmup |
| --warmup-fast | Run a lightweight warmup (one probe per resolver) | dns-benchmark benchmark --use-defaults --warmup-fast |
| --include-charts | Embed charts and graphs in PDF/Excel reports for visual performance analysis | dns-benchmark benchmark --use-defaults --formats pdf,excel --include-charts |
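As a quick illustration, the new options can be combined in a single run; this is just a sketch, combine only the flags you need:

# Three iterations with cache reuse, a fast warmup, and charts embedded in the reports
dns-benchmark benchmark --use-defaults -i 3 --use-cache --warmup-fast \
  --formats pdf,excel --include-charts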
CLI Commands
The DNS Benchmark Tool now includes three specialized commands for different workflows:
Top
Quickly rank resolvers by speed and reliability.
# Rank resolvers quickly
dns-benchmark top

# Use custom domain list
dns-benchmark top -d domains.txt

# Export results to JSON
dns-benchmark top -o results.json
Compare
Benchmark resolvers side-by-side with detailed statistics.
# Compare Cloudflare, Google, and Quad9
dns-benchmark compare Cloudflare Google Quad9

# Compare by IP addresses
dns-benchmark compare 1.1.1.1 8.8.8.8 9.9.9.9

# Show detailed per-domain breakdown
dns-benchmark compare Cloudflare Google --show-details

# Export results to CSV
dns-benchmark compare Cloudflare Google -o results.csv
Monitoring
Continuously monitor resolver performance with alerts.
# Monitor default resolvers continuously (every 60s)
dns-benchmark monitoring --use-defaults

# Monitor with custom resolvers and domains
dns-benchmark monitoring -r resolvers.json -d domains.txt

# Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
Command Showcase
| Command | Purpose | Typical Use Case | Key Options | Output |
|---|---|---|---|---|
| top | Quick ranking of resolvers by speed and reliability | Fast check to see which resolver is best right now | --domains, --record-types, --output | Sorted list of resolvers with latency & success rate |
| compare | Side-by-side comparison of specific resolvers | Detailed benchmarking across chosen resolvers/domains | --domains, --record-types, --iterations, --output, --show-details | Table of resolvers with latency, success rate, per-domain breakdown |
| monitoring | Continuous monitoring with alerts | Real-time tracking of resolver performance over time | --interval, --duration, --alert-latency, --alert-failure-rate, --output, --use-defaults | Live status indicators, alerts, optional log file |
Analysis Enhancements
- Iteration count: displayed when more than one iteration is run.
- Cache hits: shows how many queries were served from cache (when --use-cache is enabled).
- Failure tracking: resolvers with repeated errors are counted and can be inspected with get_failed_resolvers().
- Cache statistics: available via get_cache_stats(), showing the number of cached entries and whether the cache is enabled.
- Warmup results: warmup queries are marked with iteration=0 in raw data, making them easy to filter out in analysis.
Example summary output:
=== BENCHMARK SUMMARY ===
Total queries: 150
Successful: 140 (93.33%)
Average latency: 212.45 ms
Median latency: 198.12 ms
Fastest resolver: Cloudflare
Slowest resolver: Quad9
Iterations: 3
Cache hits: 40 (26.7%)
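Because warmup queries carry iteration=0 in the raw data, they are easy to drop before further analysis. A minimal sketch, assuming the raw CSV's first column holds the iteration number and using an illustrative file name:

# File name and column position are assumptions; adjust to your actual raw-data export
# Keep the CSV header plus every row whose iteration field is not 0
awk -F',' 'NR == 1 || $1 != "0"' ./benchmark_results/raw_data.csv > raw_without_warmup.csv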
Best Practices
| Mode | Recommended Flags | Purpose |
|---|---|---|
| Quick Run | --iterations 1 --timeout 1 --retries 0 --warmup-fast | Fast feedback, minimal retries, lightweight warmup. Good for quick checks. |
| Thorough Run | --iterations 3 --use-cache --warmup --timeout 5 --retries 2 | Multiple passes, cache enabled, full warmup. Best for detailed benchmarking. |
| Debug Mode | --iterations 1 --timeout 10 --retries 0 --quiet | Long timeout, no retries, minimal output. Useful for diagnosing resolver issues. |
| Balanced Run | --iterations 2 --use-cache --warmup-fast --timeout 2 --retries 1 | A middle ground: moderate speed, some retries, cache enabled, quick warmup. |
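For example, a thorough run from the table above might look like this (a sketch against the default resolver and domain set):

# Thorough run: three passes, cache reuse, full warmup, generous timeout and retries
dns-benchmark benchmark --use-defaults --formats csv,excel \
  --iterations 3 --use-cache --warmup --timeout 5 --retries 2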
Feedback & Community Input
We value your input! Help us improve dns-benchmark by sharing your experience and DNS challenges.
Feedback Command
Open the feedback form directly from the CLI by running dns-benchmark feedback. This command:
- Opens the feedback survey in your default browser
- Takes ~2 minutes to complete
- Directly shapes our roadmap and priorities
- Automatically marks feedback as given (won't prompt again)
Survey link: https://forms.gle/BJBiyBFvRJHskyR57
Smart Feedback Prompts
To avoid being intrusive, dns-benchmark uses intelligent prompting:
When prompts appear:
- After your 5th, 15th, and 30th benchmark run
- With a 24-hour cooldown between prompts
- Only if you haven't already given feedback
Auto-dismiss conditions:
- You've already submitted feedback
- You've dismissed the prompt 3 times
- You've opted out via environment variable
Example prompt:
──────────────────────────────────────────────────────────
Quick feedback request
Help shape dns-benchmark! Share your biggest DNS challenge.
→ https://forms.gle/BJBiyBFvRJHskyR57 (2 min survey)
→ Or run: dns-benchmark feedback
──────────────────────────────────────────────────────────
Show this again? (y/n) [y]:
Privacy & Data Storage
What we store locally:
dns-benchmark stores feedback prompt state in ~/.dns-benchmark/feedback.json
Contents:
{
"total_runs": 15,
"feedback_given": false,
"dismissed_count": 0,
"last_shown": 1699876543,
"version": "1.0"
}
Privacy notes:
- All data stored locally on your machine
- No telemetry or tracking
- No automatic data transmission
- File is only read/written during benchmark runs
- Safe to delete at any time
What we collect (only when you submit feedback):
- Whatever you choose to share in the survey
- We never collect usage data automatically
Opting Out
Method 1: Dismiss the prompt
When prompted, type n to dismiss:
Show this again? (y/n) [y]: n
Got it! We won't ask again. Thanks for using dns-benchmark!
After 3 dismissals, prompts stop permanently.
Method 2: Environment variable (complete disable)
# Bash/Zsh
export DNS_BENCHMARK_NO_FEEDBACK=1

# Windows PowerShell
$env:DNS_BENCHMARK_NO_FEEDBACK="1"

# Permanently (add to ~/.bashrc or ~/.zshrc)
echo 'export DNS_BENCHMARK_NO_FEEDBACK=1' >> ~/.bashrc
Method 3: Delete state file
rm ~/.dns-benchmark/feedback.json
Method 4: CI/CD environments
Feedback prompts are automatically disabled when:
- the CI=true environment variable is set (standard in GitHub Actions, GitLab CI, etc.)
- the --quiet flag is used
Reset for testing (developers):
dns-benchmark reset-feedback   # Hidden command
Configuration Files
Resolvers JSON format
{
"resolvers": [
{
"name": "Cloudflare",
"ip": "1.1.1.1",
"ipv6": "2606:4700:4700::1111"
},
{
"name": "Google DNS",
"ip": "8.8.8.8",
"ipv6": "2001:4860:4860::8888"
}
]
}
Domains text file format
# Popular websites
google.com
github.com
stackoverflow.com

# Corporate domains
microsoft.com
apple.com
amazon.com

# CDN and cloud
cloudflare.com
aws.amazon.com
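Before a long run it can help to sanity-check both input files. A small sketch (the paths are just examples; any JSON-aware tool works for the first step):

# Validate the resolvers JSON syntax
python -m json.tool data/resolvers.json > /dev/null && echo "resolvers.json: OK"

# Count the domains that will actually be tested (skips comments and blank lines)
grep -vEc '^\s*(#|$)' data/domains.txt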
Output formats
CSV outputs
- Raw data: individual query results with timestamps and metadata
- Summary statistics: aggregated metrics per resolver
- Domain statistics: per-domain metrics (when --domain-stats)
- Record type statistics: per-record-type metrics (when --record-type-stats)
- Error breakdown: counts by error type (when --error-breakdown)
Excel report
- Raw data sheet: all query results with formatting
- Resolver summary: comprehensive statistics with conditional formatting
- Domain stats: per-domain performance (optional)
- Record type stats: per-record-type performance (optional)
- Error breakdown: aggregated error counts (optional)
- Performance analysis: charts and comparative analysis
PDF report
- Executive summary: key findings and recommendations
- Performance charts: latency comparison; optional success rate chart
- Resolver rankings: ordered by average latency
- Detailed analysis: technical deep-dive with percentiles
Optional PDF Export
By default, the tool supports CSV and Excel exports.
PDF export requires the extra dependency weasyprint, which is not installed automatically to avoid runtime issues on some platforms.
Install with PDF support
pip install dns-benchmark-tool[pdf]
Usage
Once installed, you can request PDF output via the CLI:
dns-benchmark --use-defaults --formats pdf --output ./results
If weasyprint is not installed and you request PDF output, the CLI will show:
[-] Error during benchmark: PDF export requires 'weasyprint'. Install with: pip install dns-benchmark-tool[pdf]
WeasyPrint Setup (for PDF export)
The DNS Benchmark Tool uses WeasyPrint to generate PDF reports.
If you want PDF export, you need extra system libraries in addition to the Python package.
Linux (Debian/Ubuntu)
sudo apt install python3-pip libpango-1.0-0 libpangoft2-1.0-0 \
  libharfbuzz-subset0 libjpeg-dev libopenjp2-7-dev libffi-dev
macOS (Homebrew)
brew install pango cairo libffi gdk-pixbuf jpeg openjpeg harfbuzz
Windows
Install GTK+ libraries using one of these methods:
- MSYS2: Download MSYS2, then run:
  pacman -S mingw-w64-x86_64-gtk3 mingw-w64-x86_64-libffi
- GTK+ 64-bit Installer: Download the GTK+ Runtime and run the installer.
  Restart your terminal after installation.
Verify Installation
After installing the system libraries, install the Python extra:
pip install dns-benchmark-tool[pdf]
Then run:
dns-benchmark --use-defaults --formats pdf --output ./results
JSON export
- Machine-readable bundle including:
- Overall statistics
- Resolver statistics
- Raw query results
- Domain statistics
- Record type statistics
- Error breakdown
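The exported bundle can be inspected with standard JSON tooling. A sketch only; the file name depends on your --output path, and the top-level key names are whatever the current version emits:

# List the top-level sections of the exported JSON bundle (file name is illustrative)
jq 'keys' ./results/benchmark.json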
Generate Sample Config
dns-benchmark generate-config \
  --category privacy \
  --output my-config.yaml
Performance optimization
# Large-scale testing (1000+ queries)
dns-benchmark benchmark \
  --resolvers data/many_resolvers.json \
  --domains data/many_domains.txt \
  --max-concurrent 50 \
  --timeout 3 \
  --quiet \
  --formats csv

# Unstable networks
dns-benchmark benchmark \
  --resolvers data/backup_resolvers.json \
  --domains data/critical_domains.txt \
  --timeout 10 \
  --retries 3 \
  --max-concurrent 10

# Quick diagnostics
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8" \
  --domains "google.com,cloudflare.com" \
  --formats csv \
  --quiet \
  --timeout 2
Troubleshooting
# Command not found
pip install -e .
python -m dns_benchmark.cli --help

# PDF generation fails (Ubuntu/Debian)
sudo apt-get install libcairo2 libpango-1.0-0 libpangocairo-1.0-0 \
  libgdk-pixbuf2.0-0 libffi-dev shared-mime-info

# Or skip PDF
dns-benchmark benchmark --use-defaults --formats csv,excel

# Network timeouts
dns-benchmark benchmark --use-defaults --timeout 10 --retries 3
dns-benchmark benchmark --use-defaults --max-concurrent 25
Debug mode
# Verbose run
python -m dns_benchmark.cli benchmark --use-defaults --formats csv

# Minimal configuration
dns-benchmark benchmark --resolvers "1.1.1.1" --domains "google.com" --formats csv
Automation & CI
Cron jobs
# Daily monitoring
0 2 * * * /usr/local/bin/dns-benchmark benchmark --use-defaults --formats csv --quiet --output /var/log/dns_benchmark/daily_$(date +\%Y\%m\%d)

# Time-based variability (every 6 hours)
0 */6 * * * /usr/local/bin/dns-benchmark benchmark --use-defaults --formats csv --quiet --output /var/log/dns_benchmark/$(date +\%Y\%m\%d_\%H)
GitHub Actions example
- name: DNS Performance Test
  run: |
    pip install dns-benchmark-tool
    dns-benchmark benchmark \
      --resolvers "1.1.1.1,8.8.8.8" \
      --domains "api.service.com,database.service.com" \
      --formats csv \
      --quiet
Screenshots
Place images in docs/screenshots/:
- docs/screenshots/cli_run.png
- docs/screenshots/excel_report.png
- docs/screenshots/pdf_summary.png
- docs/screenshots/pdf_charts.png
- docs/screenshots/excel_charts.png
- docs/screenshots/real_time_monitoring.png
1. CLI Benchmark Run
2. Excel Report Output
3. PDF Executive Summary
4. PDF Charts
5. Excel Charts
6. Real Time Monitoring
Getting help
dns-benchmark --help
dns-benchmark benchmark --help
dns-benchmark list-resolvers --help
dns-benchmark list-domains --help
dns-benchmark list-categories --help
dns-benchmark generate-config --help
Common scenarios:
# I'm new - where to start?
dns-benchmark list-defaults
dns-benchmark benchmark --use-defaults

# Test specific resolvers
dns-benchmark list-resolvers --category security
dns-benchmark benchmark --resolvers data/security_resolvers.json --use-defaults

# Generate a management report
dns-benchmark benchmark --use-defaults --formats excel,pdf \
  --domain-stats --record-type-stats --error-breakdown --json \
  --output ./management_report
Release workflow
1. Prerequisites
   - GPG key configured: run make gpg-check to verify.
   - Branch protection: main requires signed commits and passing CI.
   - CI publish: triggered on signed tags matching vX.Y.Z.
2. Prepare release (signed)
   - Patch/minor/major bump:
     make release-patch   # or: make release-minor / make release-major
     - Updates versions.
     - Creates or reuses release/X.Y.Z.
     - Makes a signed commit and pushes the branch.
   - Open PR: from release/X.Y.Z into main, then merge once CI passes.
3. Tag and publish
   - Create signed tag and push:
     make release-tag VERSION=X.Y.Z
     - Tags main with vX.Y.Z (signed).
     - CI publishes to PyPI.
4. Manual alternative
   - Create branch and commit signed:
     git checkout -b release/manually-update-version-based-on-release-pattern
     git add .
     git commit -S -m "Release release/$NEXT_VERSION"
     git push origin release/$NEXT_VERSION
   - Open PR and merge into main.
   - Then tag:
     make release-tag VERSION=$NEXT_VERSION
5. Notes
   - Signed commits: git commit -S ...
   - Signed tags: git tag -s vX.Y.Z -m "Release vX.Y.Z"
   - Version sources: pyproject.toml and src/dns_benchmark/__init__.py
Hosted Version (Coming Soon)
CLI stays free forever. The hosted version adds features impossible to achieve locally:
Multi-Region Testing
Test from US-East, US-West, EU, Asia simultaneously. See how your DNS performs for users worldwide.
Historical Tracking
Monitor DNS performance over time. Identify trends, degradation, and optimize continuously.
Smart Alerts
Get notified via Email, Slack, PagerDuty when DNS performance degrades or SLA thresholds are breached.
Team Collaboration
Share results, dashboards, and reports across your team. Role-based access control.
SLA Compliance
Automated monthly reports proving your DNS provider meets SLA guarantees. Audit-ready documentation.
API Access
Integrate DNS monitoring into your existing observability stack. Prometheus, Datadog, Grafana.
Join the Waitlist → | Early access gets 50% off for 3 months
Roadmap
Current Release (CLI Edition)
- Benchmark DNS resolvers across domains and record types
- Export to CSV, Excel, PDF, JSON
- Statistical analysis (P95, P99, jitter, consistency)
- Automation support (CI/CD, cron)
Hosted Version (Q1 2026)
CLI stays free forever. Hosted adds:
- Multi-region testing (US, EU, Asia, custom)
- Historical tracking with charts and trends
- Alerts (Email, Slack, PagerDuty, webhooks)
- Team collaboration and sharing
- SLA compliance reporting
- API access and integrations
Join Waitlist for early access
More Network Tools (Q1-Q2 2026)
Part of BuildTools - Network Performance Suite:
- HTTP/HTTPS Benchmark - Test API endpoints and CDNs
- SSL Certificate Monitor - Never miss renewals
- Uptime Monitor - 24/7 availability tracking
- API Health Dashboard - Complete network observability
Your Input Matters
Help shape our roadmap:
- 2-minute feedback survey
- GitHub Discussions
- Star us if this helps you!
Contributing
We love contributions! Here's how you can help:
Ways to Contribute
- Report bugs - Open an issue
- Suggest features - Start a discussion
- Improve docs - README, examples, tutorials
- Submit PRs - Bug fixes, features, tests
- Star the repo - Help others discover the tool
- Spread the word - Tweet, blog, share
Development & Makefile Commands
This project includes a Makefile to simplify installation, testing, and code quality checks.
.PHONY: install install-dev uninstall mypy black isort flake8 cov test clean cli-test

# Install package (runtime only)
install:
	pip install .

# Install package with dev extras (pytest, mypy, flake8, black, isort, etc.)
install-dev:
	pip install .[dev]

# Uninstall package
uninstall:
	pip uninstall -y dns-benchmark-tool \
		dnspython pandas aiohttp click pyfiglet colorama Jinja2 weasyprint openpyxl pyyaml tqdm matplotlib \
		mypy black flake8 autopep8 pytest coverage isort

mypy:
	mypy .

isort:
	isort .

black:
	black .

flake8:
	flake8 src tests --ignore=E126,E501,E712,F405,F403,E266,W503 --max-line-length=88 --extend-ignore=E203

cov:
	coverage erase
	coverage run --source=src -m pytest -vv -s
	coverage html

test: mypy black isort flake8 cov

clean:
	rm -rf __pycache__ .pytest_cache htmlcov .coverage coverage.xml \
		build dist *.egg-info .eggs benchmark_results

cli-test:
	# Run only the CLI smoke tests marked with @pytest.mark.cli
	pytest -vv -s -m cli tests/test_cli_commands.py
Common usage
- Install runtime only: make install
- Install with dev dependencies: make install-dev
- Run type checks, linting, formatting, and tests: make test
- Run CLI smoke tests only: make cli-test
- Clean build/test artifacts: make clean
Code Guidelines
- Follow PEP 8 style guide
- Add tests for new features
- Update documentation
- Keep PRs focused and atomic
FAQ
Why is my ISP's DNS not fastest?
Local ISP DNS often has caching advantages but may lack:
- Global anycast network (slower for distant domains)
- DNSSEC validation
- Privacy features (DoH/DoT)
- Reliability guarantees
Test both and decide based on YOUR priorities!
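For example, you can put your ISP's resolver head to head with public ones using the compare command (replace 192.168.1.1 with your ISP or router resolver):

# Per-domain breakdown for a local/ISP resolver vs Cloudflare and Google
dns-benchmark compare 192.168.1.1 1.1.1.1 8.8.8.8 --show-details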
How often should I benchmark DNS?
- One-time: When choosing DNS provider
- Monthly: For network health checks
- Before migration: When switching providers
- After issues: To troubleshoot performance
Can I test my own DNS server?
Yes! Just add it to a custom resolvers JSON file:
{
"resolvers": [
{"name": "My DNS", "ip": "192.168.1.1"}
]
}
What's the difference between CLI and hosted version?
CLI (Free Forever):
- Run tests from YOUR location
- Save results locally
- Manual execution
- Open source
Hosted (Coming Soon):
- Test from MULTIPLE regions
- Historical tracking
- Automated scheduling
- Alerts and integrations
Is this tool safe to use in production?
Yes! The tool only performs DNS lookups (read operations). It does NOT:
- Modify DNS records
- Perform attacks
- Send data to external servers (unless you enable hosted features)
All tests are standard DNS queries that any resolver handles daily.
Why do results vary between runs?
DNS performance varies due to:
- Network conditions
- DNS caching (resolver and intermediate)
- Server load
- Geographic routing changes
Run multiple iterations (--iterations 5) for more consistent results.
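For example (flags as documented above; adjust the output format to taste):

# Five passes smooth out transient network noise in the averaged statistics
dns-benchmark benchmark --use-defaults --formats csv --iterations 5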
Links & Support
Official
- Website: buildtools.net
- PyPI: dns-benchmark-tool
- GitHub: frankovo/dns-benchmark-tool
Community
- Feedback: 2-minute survey
- Discussions: GitHub Discussions
- Issues: Bug Reports
Stats
- Downloads: 1,400+ (this week)
- Active Users: 600+
License
This project is licensed under the MIT License β see the LICENSE file for details.