Comprehensive library for building intelligent AI agents
## 🚀 Quick Start

See how easy it is to use Ailoy through the examples below.
### Get your agent in a single line of code

Check out the simplest Python example for building an agent with a local model.
```python
import ailoy as ai

# Create an agent with a local model in a single line of code.
agent = ai.Agent(ai.LangModel.new_local_sync("Qwen/Qwen3-8B"))

# Get the response from the agent simply by calling the `run` method.
response = agent.run("Explain quantum computing in one sentence")
print(response.contents[0].text)
```
### Easy-to-integrate LLM APIs

Here's a simple JavaScript example using an LLM API.
```javascript
import * as ai from "ailoy-node";

async function main() {
  const lm = await ai.LangModel.newStreamAPI(
    "OpenAI", // spec
    "gpt-5", // modelName
    "YOUR_OPENAI_API_KEY" // apiKey
  );
  const agent = new ai.Agent(lm);
  for await (const resp of agent.run("Please give me a short poem about AI")) {
    if (resp.message.contents[0].type === "text") {
      console.log(resp.message.contents[0].text);
    }
  }
}

main().catch((err) => {
  console.error("Error:", err);
});
```
### Browser-Native AI (WebAssembly)

You can build your agent entirely in the browser using WebAssembly, in just a few lines of code.
```javascript
import * as ai from "ailoy-web";

// Check WebGPU support
const { supported } = await ai.isWebGPUSupported();

// Run AI entirely in the browser - no server needed!
const agent = new ai.Agent(await ai.LangModel.newLocal("Qwen/Qwen3-0.6B"));
```
### Quick-Customizable Web Agent UI Template

Just clone the template to build your own web agent in minutes.
## 🔥 Key Features

### Simple Framework and Powerful Features for AI Agents
- No boilerplate, no complex setup
- Reasoning: enable extended thinking effortlessly
- Multi-Modal Inputs: Process both text and images
- Extensible Tool Calling: User-defined functions and Model Context Protocol (MCP) tools
- Retrieval-Augmented Generation (RAG): Integrates external knowledge bases without boilerplate
### Cross-Platform & Multi-Language APIs

- Supports synchronous and asynchronous APIs
- Supports web-browser-native AI (WebAssembly): run AI entirely in the browser, no server needed!
### Flexible Model Adoption

- Supports both local AI execution and cloud AI providers
- Effortlessly switch between open-source models and AI services
- Minimal software dependencies: deploy anywhere, from cloud to edge
- Fast and memory-safe, making it well suited to edge computing and low-resource devices
## Documentation & Community

- Discord Community: join to ask questions, share your projects, and get help.
## Example Projects
| Project | Description |
|---|---|
| Gradio Chatbot | Web UI chatbot with tool integration |
| Web Assistant | Browser-based AI assistant (WASM) |
| RAG Electron App | Desktop app with document Q&A |
| MCP Integration | GitHub & Playwright tools via MCP |
## Installation
> [!WARNING]
> Ailoy is under active development. APIs may change with version updates.
### Python
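A minimal install sketch. The PyPI package name `ailoy` is an assumption inferred from the `import ailoy` statement in the Quick Start; verify it against the official documentation.

```shell
# Assumed PyPI package name - inferred from `import ailoy`, not confirmed.
pip install ailoy
```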
### Node.js
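A minimal install sketch; the npm package name `ailoy-node` matches the import specifier used in the Quick Start example.

```shell
# Package name taken from the `import * as ai from "ailoy-node"` example.
npm install ailoy-node
```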
### Browser (WebAssembly)
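A minimal install sketch; the npm package name `ailoy-web` matches the import specifier used in the browser example.

```shell
# Package name taken from the `import * as ai from "ailoy-web"` example.
npm install ailoy-web
```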
## Support Specifications
### Supported AI Models
| Type | Provider & Models |
|---|---|
| Local Model | |
| Cloud API | |
| Cloud API | |
| Cloud API | |
| Cloud API | |
### Supported Languages
| Language | Version |
|---|---|
| Python | 3.10+ |
| JavaScript | ES5+, Node.js 20+ |
### Supported Platforms
| Supported Platform | System Requirements (for Local AI) |
|---|---|
| Windows | Vulkan 1.4 compatible GPU |
| Linux | Vulkan 1.4 compatible GPU |
| macOS | Apple Silicon with Metal |
| Web Browser | WebGPU with shader-f16 support |
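The "WebGPU with shader-f16 support" requirement in the last row can be probed directly with the standard WebGPU API. This is a minimal sketch assuming a browser context; the helper names `supportsShaderF16` and `meetsAiloyWebRequirements` are illustrative, not part of Ailoy (in real code, prefer Ailoy's own `ai.isWebGPUSupported()` shown in the Quick Start).

```javascript
// Probe the browser for the web-platform requirements above using the
// standard WebGPU API. Helper names here are illustrative, not Ailoy APIs.

// GPUAdapter.features is a set-like GPUSupportedFeatures object, so any
// object exposing .has() (e.g. a plain Set) behaves the same way.
function supportsShaderF16(adapter) {
  return adapter !== null && adapter.features.has("shader-f16");
}

async function meetsAiloyWebRequirements() {
  // navigator.gpu is undefined when the browser lacks WebGPU entirely.
  if (typeof navigator === "undefined" || !("gpu" in navigator)) return false;
  const adapter = await navigator.gpu.requestAdapter();
  return supportsShaderF16(adapter);
}
```

In a WebGPU-capable browser with `shader-f16`, `await meetsAiloyWebRequirements()` resolves to `true`; in Node.js or an unsupported browser it resolves to `false`.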