Show HN: Collate – Offline PDF Summarizer Using Llama 3.2
Hi HN!
I’m excited to share Collate, a macOS app I built that lets you chat with PDFs, generate summaries, and read documents—all offline. It runs entirely on-device using Llama 3.2 quantized models, ensuring privacy without compromising on capability.
Features:
• Ask Questions: Interact with PDFs to extract key insights or answers without manual searching.
• Summarize: Generate concise summaries from long documents in seconds.
• Privacy-First: Built with on-device processing—your data never leaves your machine.
• Offline-Ready: Works without an internet connection, powered by efficient quantized models.
Under the Hood:
• Runs Llama 3.2 quantized models (see the sketch below for the general approach).
• Optimized for macOS to provide a seamless offline experience.
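For anyone curious about the general pattern, here is a rough sketch of on-device PDF summarization with a quantized Llama 3.2 model. This is not Collate's actual code (the app is a native macOS implementation); it only illustrates the idea using llama-cpp-python and pypdf, with a placeholder model path:

    # Illustrative only: summarize a PDF with a local quantized Llama 3.2 model.
    from llama_cpp import Llama
    from pypdf import PdfReader

    # Load a quantized GGUF build of Llama 3.2 (path is a placeholder).
    llm = Llama(model_path="llama-3.2-3b-instruct-q4_k_m.gguf", n_ctx=8192, verbose=False)

    def summarize_pdf(path: str) -> str:
        # Extract plain text from every page of the PDF.
        text = "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)
        # Ask the local model for a concise summary; nothing leaves the machine.
        result = llm.create_chat_completion(
            messages=[
                {"role": "system", "content": "Summarize the document concisely."},
                {"role": "user", "content": text[:12000]},  # naive truncation to fit context
            ],
            max_tokens=512,
        )
        return result["choices"][0]["message"]["content"]

    print(summarize_pdf("paper.pdf"))

A real app needs smarter chunking than the naive truncation shown here, but the key point is the same: both the model and the document stay on your machine.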
The app is free and available on the macOS App Store.
Demo: https://collate.one/get-started#features
Download: https://apps.apple.com/us/app/collateai/id6447429913
I’d love your thoughts on features, performance, or ideas to make this better. Thanks for taking a look!