Sandboxed AI Skill Platform
Luminarys runs AI skills in isolated sandboxes, scales them across clustered nodes, and exposes them via MCP (with ACP planned) or the built-in autonomous agent.
Write skills in any language that compiles to WebAssembly. Deploy on any architecture — from cloud to IoT.
Sandboxed skill execution
Every skill runs in its own isolated sandbox with fine-grained permissions. File access, network, shell commands — everything is controlled by a deployment manifest. A compromised skill cannot reach the host or other skills.
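As an illustration of manifest-driven permissions, a deployment manifest could scope a skill like the sketch below. The field names and layout are hypothetical, not the actual Luminarys schema:

```toml
# Hypothetical manifest sketch -- field names are illustrative.
[skill]
name   = "image-resizer"
module = "image_resizer.wasm"

[permissions]
# Deny by default: anything not listed is unavailable to the skill.
fs_read  = ["/data/input"]          # read-only paths
fs_write = ["/data/output"]         # writable paths
network  = ["api.example.com:443"]  # allowed outbound hosts
shell    = []                       # no shell commands permitted
```

Because permissions are declared at deployment rather than in code, the same compiled skill can be granted different capabilities in different environments.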
Run anywhere — cloud to IoT
Skills compile once and run on any platform: Linux, macOS, Windows — on x86, ARM, RISC-V, MIPS. From cloud servers to Raspberry Pi, industrial controllers, and embedded gateways. No recompilation needed.
Multi-node clustering
Deploy skills across multiple nodes connected via NATS. Clients see one MCP server, but skills execute on the node that has them. File transfer between nodes is built in. Add capacity by adding nodes.
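Luminarys's routing is internal, but conceptually the cluster behaves like the following sketch: each node registers the skills it hosts, and a call is forwarded to the owning node over the message bus. All names and types here are illustrative, not the Luminarys API:

```go
package main

import (
	"errors"
	"fmt"
)

// Cluster is a hypothetical sketch of skill routing. Each node
// advertises the skills it hosts; the entry node looks up the owner
// and forwards the call (over NATS, in Luminarys's case).
type Cluster struct {
	owners map[string]string // skill name -> node ID
}

// Register records that a node hosts the given skills.
func (c *Cluster) Register(node string, skills ...string) {
	for _, s := range skills {
		c.owners[s] = node
	}
}

// Route returns the node that hosts a skill, or an error if no
// node in the cluster has it deployed.
func (c *Cluster) Route(skill string) (string, error) {
	node, ok := c.owners[skill]
	if !ok {
		return "", errors.New("no node hosts skill " + skill)
	}
	return node, nil
}

func main() {
	c := &Cluster{owners: map[string]string{}}
	c.Register("node-a", "resize-image")
	c.Register("node-b", "transcribe-audio")
	node, _ := c.Route("transcribe-audio")
	fmt.Println(node) // node-b
}
```

From the client's perspective none of this is visible: every skill appears on the one MCP endpoint it connected to.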
Any language, one ABI
Write skills in any language that compiles to WebAssembly. SDKs are available for AssemblyScript, Go, and Rust today. The same ABI works across all languages — skills from different SDKs run side by side.
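Each SDK wraps the shared ABI in its own idiom; the exact SDK calls are not shown here. As an illustration only (not the Luminarys ABI), a WASI-style program in Go that reads input from stdin and writes a result to stdout compiles unchanged to WebAssembly:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// process is a trivial "skill" body: upper-case the input.
func process(input string) string {
	return strings.ToUpper(strings.TrimSpace(input))
}

// Illustrative sketch, not the Luminarys SDK. The same source builds
// for WebAssembly with the standard Go toolchain:
//   GOOS=wasip1 GOARCH=wasm go build -o skill.wasm .
func main() {
	in, _ := bufio.NewReader(os.Stdin).ReadString('\n')
	fmt.Println(process(in))
}
```

The resulting `skill.wasm` is architecture-independent, which is what lets one build run on x86, ARM, RISC-V, and MIPS alike.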
MCP & ACP protocols
Full MCP support (Streamable HTTP, SSE, stdio). Connect Claude Desktop, Cursor, Qwen CLI, or any MCP-compatible client. Skills appear as typed tools automatically. ACP support is planned.
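For example, Claude Desktop connects to stdio MCP servers through its `claude_desktop_config.json`. The structure below is Claude Desktop's standard format; the Luminarys binary path and flag are assumptions, not the documented CLI:

```json
{
  "mcpServers": {
    "luminarys": {
      "command": "/path/to/luminarys",
      "args": ["--stdio"]
    }
  }
}
```

Once connected, every deployed skill shows up in the client as a typed tool with no extra configuration per skill.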
Built-in AI agent
An autonomous agent that orchestrates skills to complete complex tasks: writes code, runs tests, deploys results — all through the same skill infrastructure. Currently in development.
Get started
Download the release, extract, and run. The server starts with a demo skill out of the box — connect your MCP client and start building.