Show HN: ThinkReview, an open-source browser copilot for GitLab and ADO PRs (Ollama)

github.com

1 point by jkshenawy22 5 months ago · 0 comments · 3 min read


Over the last few months I’ve been building a lightweight, open-source “copilot” that runs directly in your browser and helps you review Pull Requests / Merge Requests without sending code to any external service unless you choose to.

ThinkReview just gained two major updates:

1. *It is now fully open source*
2. *It now supports Ollama, so you can run LLMs locally for private code reviews*

---

## What is ThinkReview?

ThinkReview is a browser extension that attaches itself to the native PR/MR UI in:

- GitLab (self-hosted or SaaS)
- Azure DevOps
- GitHub
- Bitbucket

Instead of acting like CodeRabbit, CodeAnt, or CI-connected bots that auto-comment on your PRs, ThinkReview does something different:

- It gives you a *private chat window* attached to the diff view.
- You can ask the model questions about the MR, explore logic, identify potential bugs, or generate draft comments.
- The AI doesn't post anything automatically; you stay in control.

This works well for developers who still do most of their reviewing in the browser and don’t want a noisy bot writing public comments.
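To make the chat workflow concrete, here is a minimal sketch of how a reviewer's question and a diff hunk might be combined into a chat payload. The function name and message shapes are hypothetical, not ThinkReview's actual internals:

```javascript
// Hypothetical sketch: turn a diff hunk and a reviewer's question into
// chat messages for an LLM. Illustrative only, not ThinkReview's code.
function buildReviewPrompt(diffHunk, question) {
  return [
    {
      role: "system",
      content:
        "You are a code review assistant. Answer questions about the " +
        "diff below and suggest draft comments, but never post anything.",
    },
    { role: "user", content: `Diff:\n${diffHunk}\n\nQuestion: ${question}` },
  ];
}

const messages = buildReviewPrompt(
  "- const x = 1;\n+ const x = parseInt(input);",
  "Can parseInt return NaN here?"
);
console.log(messages.length); // 2: one system message, one user message
```

The key design point from the post is in the system message: the model drafts, the human posts.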

---

## Demo (GIF)

<img src="https://firebasestorage.googleapis.com/v0/b/thinkgpt.firebas..." width="600" />

---

## Why open source?

A lot of early users (especially from companies running self-hosted GitLab) asked for:

- transparency around where code goes
- the ability to self-audit
- control over model choice
- contributions and community fixes

Repo: https://github.com/Thinkode/thinkreview-browser-extension

It’s built with standard browser APIs and a small LLM integration layer, so it’s very hackable.
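As a rough illustration of what a "small LLM integration layer" can look like (provider names, config keys, and the cloud URL below are placeholders, not the extension's real configuration), one function can resolve which endpoint a chat request goes to:

```javascript
// Hypothetical sketch of a provider-dispatch layer: given a config
// object, pick the chat endpoint. Keys and URLs are illustrative.
function resolveEndpoint(config) {
  if (config.provider === "ollama") {
    // Ollama's default local port; requests never leave the machine.
    return `${config.baseUrl || "http://localhost:11434"}/api/chat`;
  }
  // Otherwise fall back to a cloud provider (placeholder URL).
  return "https://api.example-cloud-llm.com/v1/chat";
}

// Usage: switching providers is just a config change.
console.log(resolveEndpoint({ provider: "ollama" }));
```

Keeping the dispatch in one place is what makes "swap models instantly" cheap: the rest of the extension only sees messages in and text out.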

---

## Ollama Support (local LLMs)

As of v1.4.0, you can point ThinkReview to your local Ollama instance. This lets you run any supported model:

- Qwen Coder
- Llama 3
- DeepSeek
- Codestral
- Any other Ollama model

### Why this matters

- Your code *never leaves your machine*
- Zero cost
- Works cleanly with self-hosted GitLab / air-gapped setups
- No API keys, no vendor lock-in
- You can swap models instantly

If you prefer speed, you can still use cloud LLMs; if you prefer privacy, Ollama works surprisingly well (a review took roughly 50 seconds on a Mac mini M4 in testing).
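Talking to a local Ollama instance is a single HTTP call against its `/api/chat` endpoint. A minimal sketch (the helper name is mine; the model name is just an example of something you might have pulled):

```javascript
// Sketch: build a request for Ollama's /api/chat endpoint.
// The endpoint and body fields follow Ollama's REST API; the
// buildOllamaRequest helper itself is illustrative.
function buildOllamaRequest(model, messages) {
  return {
    url: "http://localhost:11434/api/chat",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false returns one JSON response instead of chunks
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Usage (requires `ollama serve` running and the model pulled):
// const { url, options } = buildOllamaRequest("llama3", msgs);
// const res = await fetch(url, options);
// const { message } = await res.json(); // message.content is the reply
```

Because everything goes to `localhost:11434`, no API key is involved and nothing leaves the machine, which is the whole point for air-gapped or self-hosted setups.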

---

## Installation

Works on any Chromium browser:

Chrome Web Store: https://chromewebstore.google.com/detail/thinkreview-ai-code...

No backend server required. Configuration is minimal.

---

## Looking for feedback

HN has a lot of people who:

- review PRs daily
- care about developer ergonomics
- run self-hosted GitLab
- have thoughts about local vs cloud LLM workflows
- like hacking on browser extensions

If you try it, I’d love feedback on:

- UI/UX improvements
- additional provider integrations
- other platforms to support
- performance / caching ideas for local LLMs

Discussions / issues: https://github.com/Thinkode/thinkreview-browser-extension or https://thinkreview.dev/contact

Thanks for reading, *Jay*

