
Prerequisites

  • Python 3.11+
  • uv (used below to install backend dependencies and run the server)
  • Node.js 18+
  • An OpenAI-compatible API key (Anthropic, OpenAI, OpenRouter, etc.)
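Before installing, you can confirm the toolchain is in place. This is an informational sketch; binary names may differ on your system (e.g. python vs python3).

```shell
# Print tool versions; a "not found" message means that
# prerequisite still needs installing.
python3 --version || true
node --version || true
uv --version || true
```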

Install and run

1. Clone the repository

git clone https://github.com/OpenDCAI/Mycel.git
cd Mycel
2. Install dependencies

# Backend
uv sync

# Frontend
cd frontend/app && npm install && cd ../..
3. Start services

Open two terminals:
# Terminal 1 — backend
uv run python -m backend.web.main
# Listening at http://localhost:8001

# Terminal 2 — frontend
cd frontend/app && npm run dev
# Listening at http://localhost:5173
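With both terminals running, a quick sanity check that the services are reachable (assuming the default ports above; the exact status codes returned are not specified by the docs):

```shell
# Each request prints an HTTP status code; "000" means the
# service is not reachable yet.
curl -s -o /dev/null -w "backend:  %{http_code}\n" http://localhost:8001 || true
curl -s -o /dev/null -w "frontend: %{http_code}\n" http://localhost:5173 || true
```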
4. Configure your LLM provider

  1. Open http://localhost:5173 and register an account
  2. Go to Settings → Models
  3. Enter your API key and choose a model

Mycel supports any OpenAI-compatible endpoint. If you use Anthropic directly, set ANTHROPIC_API_KEY. For OpenRouter, set OPENAI_API_KEY to your OpenRouter key and use https://openrouter.ai/api/v1 as the base URL.
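If you prefer environment variables to the Settings UI, the OpenRouter setup above looks roughly like this. The variable name OPENAI_BASE_URL follows the common OpenAI-SDK convention and is an assumption; Settings → Models remains the canonical place to configure providers.

```shell
# Placeholder key — substitute your real OpenRouter key.
export OPENAI_API_KEY="sk-or-..."
# Base URL from the note above; OPENAI_BASE_URL is the conventional
# OpenAI-SDK variable name (an assumption, not confirmed by these docs).
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
```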
5. Chat with your first agent

Navigate to the chat view and start a new conversation. The built-in Mycel agent is ready to use immediately. Try asking it to:
  • Read a file from your workspace
  • Search the codebase
  • Run a shell command

Add a sandbox (optional)

By default, agents run on your local machine. To isolate execution in a container:
1. Install Docker

Make sure Docker is running on your machine.
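A one-line check that the Docker daemon is actually reachable:

```shell
# "docker info" talks to the daemon; it fails if Docker isn't running.
docker info >/dev/null 2>&1 && echo "Docker is running" || echo "Docker is not running"
```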
2. Enable Docker in settings

Go to Settings → Sandbox. Expand the Docker card, set the image (default: python:3.12-slim), and click Save.
3. Start a sandboxed thread

In the new conversation view, select docker from the sandbox dropdown before sending your first message. All subsequent agent runs in this thread use the same isolated container.

Try multi-agent chat

Mycel’s social layer lets agents message each other — and you — like a group chat.
1. Create a second agent

Go to Members → Create. Give it a name and a system prompt (e.g., “You are a code reviewer”).
2. Open a chat with it

Go to the Chat view, find your new agent in the directory, and start a conversation.
3. Let agents talk to each other

In the first agent’s thread, tell it to message your code reviewer: “Ask the code reviewer to look at this function.” The agent will call chat_send and the reviewer will respond autonomously.

Next steps

Core concepts

Understand Threads, Members, Entities, Tasks, Resources, and Skills

Sandbox providers

Docker, E2B, Daytona, AgentBay — isolated execution environments

Configuration

Models, tools, MCP servers, memory tuning

Deployment

Run Mycel in production