llm-council/backend/config.py
Krishna Kumar 153dcff69d Add MCP server for council integration
- Add mcp_server package with 7 tools proxying to FastAPI (see the proxy sketch below the commit details):
  - council_query (full 3-stage process)
  - council_stage1_collect, stage2_rank, stage3_synthesize
  - council_conversation_create, list, get
- Add individual stage endpoints to FastAPI (/api/council/stage1, stage2, stage3)
- Update council models to use valid OpenRouter identifiers
- Add mcp>=1.0.0 dependency
2025-12-16 12:54:29 -06:00
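
The proxy pattern described above means each MCP tool is a thin wrapper that forwards its arguments to the corresponding FastAPI endpoint over HTTP. Below is a minimal sketch of one such tool, assuming the FastMCP helper from the official mcp SDK, an httpx client, a backend at http://localhost:8000, and a {"prompt": ...} payload; the backend address, payload shape, and return handling are assumptions, not taken from the repo.

import httpx
from mcp.server.fastmcp import FastMCP

BACKEND_URL = "http://localhost:8000"  # assumed backend address; the real value may differ

mcp = FastMCP("llm-council")

@mcp.tool()
async def council_stage1_collect(prompt: str) -> str:
    """Proxy to the FastAPI stage-1 endpoint (payload and response shape assumed)."""
    async with httpx.AsyncClient(timeout=300) as client:
        resp = await client.post(f"{BACKEND_URL}/api/council/stage1", json={"prompt": prompt})
        resp.raise_for_status()
        return resp.text

if __name__ == "__main__":
    mcp.run()

The other six tools would follow the same shape, differing only in endpoint path and parameters.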

27 lines · 634 B · Python

"""Configuration for the LLM Council."""
import os
from dotenv import load_dotenv
load_dotenv()
# OpenRouter API key
OPENROUTER_API_KEY = os.getenv("OPENROUTER_API_KEY")
# Council members - list of OpenRouter model identifiers
COUNCIL_MODELS = [
"openai/gpt-4o",
"google/gemini-3-pro-preview",
"anthropic/claude-sonnet-4.5",
"x-ai/grok-4.1-fast",
]
# Chairman model - synthesizes final response
CHAIRMAN_MODEL = "google/gemini-3-pro-preview"
# OpenRouter API endpoint
OPENROUTER_API_URL = "https://openrouter.ai/api/v1/chat/completions"
# Data directory for conversation storage
DATA_DIR = "data/conversations"
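
For context, a minimal sketch of how the backend might consume these settings when querying a single council member over OpenRouter's OpenAI-compatible chat completions API. The helper name, import path, payload fields, and use of httpx are illustrative assumptions, not code from the repo.

import httpx

import config

async def query_model(model: str, prompt: str) -> str:
    # Illustrative helper: send one chat completion request to OpenRouter
    # and return the assistant message text.
    headers = {"Authorization": f"Bearer {config.OPENROUTER_API_KEY}"}
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(config.OPENROUTER_API_URL, headers=headers, json=payload)
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

In the council flow, such a helper would be called once per entry in COUNCIL_MODELS for stage 1 and once with CHAIRMAN_MODEL for the final synthesis.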