Roo Code is a powerful Visual Studio Code extension that integrates large language models (LLMs) directly into your development environment, enabling you to code, debug, and ideate with AI in real time. While there are several popular alternatives such as GitHub Copilot, Cline, Cursor, and Windsurf, this guide focuses on setting up Roo Code. We will walk you through how to set up Roo Code using LLMs from OpenRouter and Google AI Studio so you can harness AI-powered development at no cost. Paid models will perform better, but let’s be honest, who doesn’t like free?
Step 1: Install Roo Code from the VS Code Marketplace
Getting started is simple:
- Open Visual Studio Code.
- Go to the Extensions sidebar (Ctrl+Shift+X or Cmd+Shift+X).
- Search for “Roo Code”.
- Click Install.
Once installed, Roo Code will add an AI assistant interface to VS Code, accessible from the sidebar (look for the kangaroo icon) or via the command palette.
Step 2: Setup OpenRouter as your AI provider
Roo Code is provider-agnostic and works well with custom endpoints. OpenRouter is a great option because it offers access to a wide variety of models, many of which are free.
Here’s how to connect Roo Code with OpenRouter:
- Go to https://openrouter.ai and create a free account.
- After signing in, navigate to your API Keys page and generate a new API key.
- In VS Code:
a. Open the Roo Code extension by clicking the kangaroo icon on the left.
b. Click the gear icon next to the Roo Code extension.
c. Select Providers on the left, find the API Provider dropdown, and select OpenRouter.
- Paste your API key in the OpenRouter API Key field.
- In the Model dropdown, select the model you would like to use. Skip ahead to Step 4 of this guide for recommended models.
You’re now all set up! Roo Code will send your prompts to OpenRouter and return results from the model you choose.
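If you’d like to sanity-check your API key outside the editor, here is a minimal sketch of the same chat-completions call Roo Code makes for you. It assumes your key is exported as an `OPENROUTER_API_KEY` environment variable; the `build_request` helper is just a name we chose for illustration, and the endpoint and payload shape follow OpenRouter’s public API docs at the time of writing.

```python
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key, model, prompt):
    """Assemble the headers and JSON body Roo Code sends to OpenRouter."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body

if __name__ == "__main__":
    key = os.environ.get("OPENROUTER_API_KEY")  # never hard-code your key
    if key:  # only hit the network when a real key is present
        headers, body = build_request(
            key, "deepseek/deepseek-chat-v3-0324:free", "Say hello"
        )
        req = urllib.request.Request(
            OPENROUTER_URL, data=json.dumps(body).encode(), headers=headers
        )
        with urllib.request.urlopen(req, timeout=60) as resp:
            reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

If the script prints a greeting, your key and model selection are working; a 401 means the key was pasted incorrectly.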
Step 3: Setup Google AI Studio (optional)
OpenRouter has a free version of Gemini 2.5 Flash, but the rate limiting is aggressive. For more relaxed rate limits, you can still get free access to this model through Google AI Studio. Here are the steps to get an API key:
- Go to Google AI Studio
- Sign in with your Google account
- In the modal popup, press the “Get API key” button
- After acknowledging the terms of service, click on the “Create API key” button on the top right.
- Copy the generated key into your Roo Code API provider settings.
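As with OpenRouter, you can verify the AI Studio key before pointing Roo Code at it. The sketch below assumes the key is exported as a `GEMINI_API_KEY` environment variable; `build_gemini_request` is our own illustrative helper, and the URL and body shape follow the Gemini API’s REST `generateContent` docs at the time of writing.

```python
import json
import os
import urllib.request

GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta/models"

def build_gemini_request(api_key, model, prompt):
    """Build the endpoint URL and body for the Gemini generateContent API."""
    url = f"{GEMINI_BASE}/{model}:generateContent?key={api_key}"
    body = {"contents": [{"parts": [{"text": prompt}]}]}
    return url, body

if __name__ == "__main__":
    key = os.environ.get("GEMINI_API_KEY")  # export your AI Studio key first
    if key:  # only hit the network when a real key is present
        url, body = build_gemini_request(
            key, "gemini-2.5-flash-preview-05-20", "Say hello"
        )
        req = urllib.request.Request(
            url,
            data=json.dumps(body).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=60) as resp:
            reply = json.load(resp)
        print(reply["candidates"][0]["content"]["parts"][0]["text"])
```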
Step 4: Choosing your LLMs — Recommended free models
OpenRouter gives you access to a wide range of LLMs. Here are some excellent, free to use models we recommend for different use cases:
⫸ DeepSeek R1
Use case: Logical reasoning, summarization, and problem solving.
Model name: deepseek-r1-0528:free
⫸ DeepSeek V3
Use case: Balanced general-purpose model.
Model name: deepseek-chat-v3-0324:free
⫸ Gemini 2.5 Flash
Use case: Google’s fast hybrid reasoning model with a large context window.
Model name: gemini-2.5-flash-preview-05-20
⫸ Mistral Devstral Small
Use Case: Lightweight, optimized for fast responses.
Model name: devstral-small:free
The best way to manage all these LLMs in Roo Code is to create a new Configuration Profile in the Providers tab. This allows for quick switching of models if one of them isn’t up for the task.
All these models suit different developer workflows: some are better at code generation, while others are better at architecting your application. Here is a full list of free LLMs on OpenRouter.
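Because OpenRouter marks its no-cost variants with a `:free` suffix on the model slug, you can also pull the current free list programmatically from its public `/api/v1/models` endpoint. A small sketch (the `free_model_ids` helper is our own name; the endpoint and suffix convention match OpenRouter’s docs at the time of writing):

```python
import json
import urllib.request

MODELS_URL = "https://openrouter.ai/api/v1/models"

def free_model_ids(models):
    """Return the sorted ids of models whose slug ends in ':free'."""
    return sorted(m["id"] for m in models if m["id"].endswith(":free"))

if __name__ == "__main__":
    try:
        with urllib.request.urlopen(MODELS_URL, timeout=30) as resp:
            data = json.load(resp)["data"]
        for model_id in free_model_ids(data):
            print(model_id)
    except OSError as err:  # offline or rate-limited: fail soft
        print(f"Could not reach OpenRouter: {err}")
```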
Step 5: Recommended LLMs for Roo Code Modes
Roo Code sets itself apart from other code assistants with its multi-mode configuration. You can assign different models to different modes; here are our recommendations based on the available free models:
- Architect Mode: Use a strong reasoning model such as Gemini 2.5 Flash Preview or Deepseek R1
- Code Mode: Try DeepSeek V3 for code generation or Mistral Devstral Small for smaller tasks.
- Ask Mode: Use Gemini 2.5 Flash Preview for ideation, and to create an implementation plan.
- Debug Mode: DeepSeek V3 is ideal for breaking down code or fixing bugs.
- Orchestrator Mode: Needs a strong reasoning model that can understand complex tasks and break them down. Use Gemini 2.5 Flash Preview or Deepseek R1.
You can configure the model for each mode in Roo Code’s extension preferences (gear icon). Just match each mode to your preferred model name or identifier.
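The per-mode recommendations above amount to a simple lookup from mode to model slug. The sketch below is purely illustrative — Roo Code stores this mapping in its own settings UI, not in a Python dict — but it makes the assignments concrete (slugs use OpenRouter’s `org/model` form, which we believe matches the short names listed in Step 4):

```python
# Illustrative mode-to-model mapping mirroring the recommendations above;
# in practice you configure this per mode in Roo Code's provider settings.
MODE_MODELS = {
    "architect": "google/gemini-2.5-flash-preview-05-20",
    "code": "deepseek/deepseek-chat-v3-0324:free",
    "ask": "google/gemini-2.5-flash-preview-05-20",
    "debug": "deepseek/deepseek-chat-v3-0324:free",
    "orchestrator": "deepseek/deepseek-r1-0528:free",
}

def model_for_mode(mode):
    """Fall back to the general-purpose code model for unknown modes."""
    return MODE_MODELS.get(mode, MODE_MODELS["code"])
```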