Local MCP Server
Run the Obris MCP server on your machine using an API key. This connects your AI tools to your Obris topics over stdio without going through the hosted server.
This is an alternative to the remote MCP connectors, which use OAuth and connect directly to mcp.obris.ai. The local server is useful when you want to run the server yourself, or when your AI tool doesn't support remote MCP connections.
Before you start
To set up the local Obris MCP server, you'll need:
- An Obris account
- An API key
- uv installed
- A supported AI platform: Claude Desktop, Claude Code, or Gemini CLI
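If you're not sure whether uv is installed, you can check for its uvx launcher from the terminal (a quick sketch; uvx ships with uv):

```shell
# Check whether uv's uvx launcher is on your PATH, and print where it lives
if command -v uvx >/dev/null 2>&1; then
  echo "uvx found at: $(command -v uvx)"
else
  echo "uvx not found - install uv first"
fi
```

The path this prints is also the one you'll need later when configuring Claude Desktop.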
Set your API key
Add your Obris API key to your shell profile so it's available in every terminal session. Add the following line to your ~/.zshrc, ~/.zprofile, or ~/.bash_profile:
export OBRIS_API_KEY=your_api_key_here
Replace your_api_key_here with your Obris API key, then reload your shell (adjust the filename if you edited a different profile):
source ~/.zshrc
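To confirm the key is exported rather than set only in the current shell, check that a child process can see it. This matters because your AI tool launches the MCP server as a child process, which only inherits exported variables. The value below is a placeholder:

```shell
# Placeholder key, standing in for the line added to your shell profile
export OBRIS_API_KEY=your_api_key_here
# Exported variables are inherited by child processes, which is how the
# MCP server launched by your AI tool will read the key:
sh -c 'echo "OBRIS_API_KEY is ${OBRIS_API_KEY:+set}"'
# prints "OBRIS_API_KEY is set"
```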
ChatGPT does not support local stdio MCP servers. Use the remote MCP connector or a Custom GPT instead.
Connect your AI platform
Select your AI platform below to see the setup instructions.
Connect via the Claude Desktop app config.
- In Claude Desktop, go to Settings > Developer and click Edit Config to open claude_desktop_config.json
- Run which uvx in your terminal to get the full path to uvx
- Add the Obris server to your config:

  {
    "mcpServers": {
      "obris": {
        "command": "/full/path/to/uvx",
        "args": ["obris-mcp"],
        "env": {
          "OBRIS_API_KEY": "$OBRIS_API_KEY"
        }
      }
    }
  }

- The config above reads OBRIS_API_KEY from your shell environment (see prerequisites)
- Restart Claude Desktop
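A malformed claude_desktop_config.json is a common reason a server fails to appear after a restart. One way to catch syntax errors early is to run the file through Python's built-in JSON parser. The sketch below writes the snippet above to a scratch file just to demonstrate the check; in practice, point json.tool at your real config file instead:

```shell
# Scratch copy of the config for demonstration; substitute your real path
CONFIG=claude_desktop_config.json
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "obris": {
      "command": "/full/path/to/uvx",
      "args": ["obris-mcp"],
      "env": { "OBRIS_API_KEY": "$OBRIS_API_KEY" }
    }
  }
}
EOF
# json.tool fails loudly on a syntax error, so this only prints on success
python3 -m json.tool "$CONFIG" >/dev/null && echo "valid JSON"
```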
Check the connection
Once connected, verify that Obris is working by asking your AI assistant a question about your topics. Try a prompt like:
What topics do I have in Obris?
Your assistant should be able to list your Obris topics and the knowledge saved in them.
AI tools decide when to invoke MCP tools. If your assistant doesn't use your Obris data, try mentioning Obris explicitly, for example "Using Obris, find my knowledge about..."
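If the assistant never seems to reach the server, you can also exercise the stdio transport by hand. The sketch below assumes OBRIS_API_KEY is exported and uses the obris-mcp package name from the config above; a standard MCP client opens the session by sending a JSON-RPC initialize request on stdin:

```shell
# JSON-RPC initialize request, the first message in an MCP stdio session
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
# Pipe it to the server; a JSON-RPC response listing the server's
# capabilities indicates the server starts and reads stdin correctly
printf '%s\n' "$INIT" | uvx obris-mcp || echo "server did not start (is uv installed and the key exported?)"
```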