
Ailoy

Comprehensive library for building intelligent AI agents


🚀 Quick Start

See how easy it is to use Ailoy through the examples below.

Get your agent in just a single line of code

Check out this minimal Python example that builds an agent with a local model.

pip install ailoy-py

import ailoy as ai

# Create an agent with a local model in a single line of code.
agent = ai.Agent(ai.LangModel.new_local_sync("Qwen/Qwen3-8B"))

# Get the response from the agent simply by calling the `run` method.
response = agent.run("Explain quantum computing in one sentence")
print(response.contents[0].text)

Easy-to-integrate LLM APIs

Here's a simple JavaScript example using an LLM API.

npm install ailoy-node

import * as ai from "ailoy-node";

async function main() {
  const lm = await ai.LangModel.newStreamAPI(
    "OpenAI", // spec
    "gpt-5", // modelName
    "YOUR_OPENAI_API_KEY" // apiKey
  );
  const agent = new ai.Agent(lm);
  for await (const resp of agent.run("Please give me a short poem about AI")) {
    if (resp.message.contents[0].type === "text") {
      console.log(resp.message.contents[0].text);
    }
  }
}

main().catch((err) => {
  console.error("Error:", err);
});

Browser-Native AI (WebAssembly)

You can build your agent entirely in the browser using WebAssembly in just a few lines of code.

npm install ailoy-web

import * as ai from "ailoy-web";

// Check WebGPU support before loading a local model
const { supported } = await ai.isWebGPUSupported();
if (!supported) {
  throw new Error("This browser does not support WebGPU");
}

// Run AI entirely in the browser - no server needed!
const agent = new ai.Agent(await ai.LangModel.newLocal("Qwen/Qwen3-0.6B"));

Quickly Customizable Web Agent UI Template

Just clone the template to build your own web agent in minutes.


🔥 Key Features

Simple Framework and Powerful Features for AI Agents

  • No boilerplate, no complex setup
  • Reasoning: Enable extended thinking effortlessly
  • Multi-Modal Inputs: Process both text and images
  • Extensible Tool Calling: User-defined functions and Model Context Protocol (MCP) tools (see the sketch below)
  • Retrieval-Augmented Generation (RAG): Integrates external knowledge bases without boilerplate
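
For a taste of tool calling with a user-defined function, here is a minimal Python sketch. The get_weather function is a made-up example, and the add_tool registration method is an assumption - the actual registration API may differ, so check the documentation for the exact call.

import ailoy as ai

# A user-defined tool: a plain Python function the model may call.
def get_weather(city: str) -> str:
    """Return a (canned) weather report for the given city."""
    return f"It is sunny in {city} today."

agent = ai.Agent(ai.LangModel.new_local_sync("Qwen/Qwen3-8B"))

# Assumed registration call; the real Ailoy API may expose tools
# differently (e.g. via an argument to Agent) - see the docs.
agent.add_tool(get_weather)

response = agent.run("What's the weather like in Seoul?")
print(response.contents[0].text)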

Cross-Platform & Multi-Language APIs

  • Python and JavaScript APIs
  • Windows, Linux, and macOS support
  • Synchronous and asynchronous APIs (see the sketch below)
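
As a rough illustration of the synchronous/asynchronous split, here is a hedged Python sketch. Only new_local_sync appears in the Quick Start above; the async constructor name (new_local) and the streamed async-for over run are assumptions modeled on the Node.js example, so verify them against the documentation.

import asyncio

import ailoy as ai

# Synchronous usage, exactly as in the Quick Start above.
def sync_example():
    agent = ai.Agent(ai.LangModel.new_local_sync("Qwen/Qwen3-0.6B"))
    print(agent.run("Say hello in one word").contents[0].text)

# Asynchronous usage (assumed API): an async constructor mirroring
# new_local_sync, with run() streaming responses as in the Node.js
# example above. Verify the exact names against the Ailoy docs.
async def async_example():
    lm = await ai.LangModel.new_local("Qwen/Qwen3-0.6B")
    agent = ai.Agent(lm)
    async for resp in agent.run("Explain WebAssembly in one sentence"):
        for content in resp.message.contents:
            if content.type == "text":
                print(content.text, end="")

sync_example()
asyncio.run(async_example())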

Browser-Native AI (WebAssembly)

  • Run AI entirely in the browser - no server needed!

Flexible Model Adoption

  • Supports both local AI execution and cloud AI providers
  • Effortlessly switch between open-source models and cloud AI services (see the sketch below)
  • Minimal software dependencies — deploy anywhere, from cloud to edge
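
Here is a hedged Python sketch of swapping a local open-source model for a cloud provider without touching the rest of the agent code. new_local_sync appears in the Quick Start; the new_stream_api_sync constructor name is an assumption that mirrors the JavaScript newStreamAPI example above.

import ailoy as ai

# Local, open-source model (constructor shown in the Quick Start).
lm = ai.LangModel.new_local_sync("Qwen/Qwen3-8B")

# Cloud provider model. The constructor name below is an assumption
# mirroring the JavaScript newStreamAPI(spec, modelName, apiKey) call.
# lm = ai.LangModel.new_stream_api_sync("OpenAI", "gpt-5", "YOUR_OPENAI_API_KEY")

# Everything downstream stays the same - only the model handle changes.
agent = ai.Agent(lm)
print(agent.run("Name one benefit of on-device inference").contents[0].text)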

Rust-Powered

  • Fast, memory-safe, minimal dependencies
  • Best choice for edge computing and low-resource devices

Documentation & Community


Example Projects

Project           Description
Gradio Chatbot    Web UI chatbot with tool integration
Web Assistant     Browser-based AI assistant (WASM)
RAG Electron App  Desktop app with document Q&A
MCP Integration   GitHub & Playwright tools via MCP

Installation

Warning

Ailoy is under active development. APIs may change with version updates.

Python

pip install ailoy-py

Node.js

npm install ailoy-node

Browser (WebAssembly)

npm install ailoy-web

Support Specifications

Supported AI Models

Type         Provider & Models
Local Model  Qwen3 (0.6B, 1.7B, 4B, 8B, 14B, 32B, 30B-A3B)
Cloud API    OpenAI (GPT)
Cloud API    Anthropic (Claude)
Cloud API    Google (Gemini)
Cloud API    xAI (Grok)

Supported Languages

Language    Version
Python      3.10+
JavaScript  ES5+, Node.js 20+

Supported Platforms

Platform     System Requirements (for Local AI)
Windows      Vulkan 1.4 compatible GPU
Linux        Vulkan 1.4 compatible GPU
macOS        Apple Silicon with Metal
Web Browser  WebGPU with shader-f16 support