Add OpenAI Deep Research MCP server
- FastMCP server with `deep_research` and `deep_research_info` tools
- OpenAI Responses API integration with background polling
- Configurable model via `DEEP_RESEARCH_MODEL` env var
- Default: `o4-mini-deep-research` (faster/cheaper)
- Optional FastAPI backend for standalone use
- Tested successfully: 80s query, 20 web searches, 4 citations
# MCP Deep Research
MCP Server for OpenAI Deep Research API - comprehensive web research with citations.
## Overview
This MCP server provides access to OpenAI's Deep Research models, which can:

- Perform extensive web searches
- Analyze data with code execution
- Synthesize findings into structured reports
- Provide citations for all sources

## Installation

```bash
cd mcp-deep-research
uv sync
```

## Configuration
### Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `OPENAI_API_KEY` | Yes | - | Your OpenAI API key |
| `DEEP_RESEARCH_MODEL` | No | `o4-mini-deep-research-2025-06-26` | Research model to use |
| `DEEP_RESEARCH_POLL_INTERVAL` | No | `5.0` | Seconds between status polls |

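To illustrate how these variables interact, here is a minimal config-loading sketch. The function and key names are illustrative, not the server's actual internals; only the variable names and defaults come from the table above.

```python
import os

# Default matches the documented DEEP_RESEARCH_MODEL fallback.
DEFAULT_MODEL = "o4-mini-deep-research-2025-06-26"

def load_config() -> dict:
    """Read the documented environment variables, applying defaults."""
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "api_key": api_key,
        "model": os.environ.get("DEEP_RESEARCH_MODEL", DEFAULT_MODEL),
        "poll_interval": float(os.environ.get("DEEP_RESEARCH_POLL_INTERVAL", "5.0")),
    }
```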
### Available Models

- `o4-mini-deep-research-2025-06-26` - faster, cheaper (default)
- `o3-deep-research-2025-06-26` - more thorough, roughly $1+ per query

## Usage
### As MCP Server (stdio)

```bash
OPENAI_API_KEY=your-key uv run python -m mcp_server.server
```

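An MCP client that launches stdio servers can register it with an entry along these lines. This follows the common `mcpServers` config convention and mirrors the CouncilApp integration below; the exact format, and the placeholder path, depend on your client.

```json
{
  "mcpServers": {
    "deep-research": {
      "command": "/bin/bash",
      "args": ["-c", "cd /path/to/mcp-deep-research && uv run python -m mcp_server.server"],
      "env": { "OPENAI_API_KEY": "sk-..." }
    }
  }
}
```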
### Standalone FastAPI (optional)
```bash
OPENAI_API_KEY=your-key uv run python -m backend.main
```

Runs on `http://localhost:8002` by default.

## MCP Tools
### `deep_research`
Performs comprehensive web research on a query.
**Parameters:**

- `query` (required): The research question or topic
- `system_prompt` (optional): Instructions to guide the research focus
- `include_code_analysis` (optional, default: `true`): Allow code execution for data analysis
- `max_wait_minutes` (optional, default: `15`): Maximum time to wait for the research to complete

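`max_wait_minutes` works together with `DEEP_RESEARCH_POLL_INTERVAL`: the server submits a background request and polls its status until it finishes or the deadline passes. A simplified sketch of that loop (function names here are illustrative, not the server's actual code):

```python
import time

def wait_for_completion(get_status, max_wait_minutes: float = 15,
                        poll_interval: float = 5.0) -> str:
    """Poll get_status() until a terminal state or the deadline.

    get_status is any zero-argument callable returning a status string
    such as "queued", "in_progress", "completed", or "failed".
    """
    deadline = time.monotonic() + max_wait_minutes * 60
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("completed", "failed"):
            return status
        time.sleep(poll_interval)  # DEEP_RESEARCH_POLL_INTERVAL
    return "timeout"
```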
**Returns:**

```json
{
  "status": "completed",
  "model": "o4-mini-deep-research-2025-06-26",
  "report_text": "# Research Report\n\n...",
  "citations": [
    {"title": "Source Title", "url": "https://..."}
  ],
  "web_searches": 12,
  "code_executions": 2,
  "elapsed_time": 180.5
}
```

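The returned dict can be post-processed directly. For example, a small helper (not part of the server) that renders the `citations` list as a Markdown sources section:

```python
def format_citations(result: dict) -> str:
    """Render the citations from a deep_research result as a Markdown list."""
    lines = ["## Sources"]
    for i, c in enumerate(result.get("citations", []), start=1):
        lines.append(f"{i}. [{c['title']}]({c['url']})")
    return "\n".join(lines)
```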
### `deep_research_info`
Returns configuration information about the deep research setup.
## Integration with CouncilApp
The server is configured in `councilapp.backend/packages/server/src/server/session.ts`:

```typescript
"deep-research": {
  command: "/bin/bash",
  args: ["-c", `cd ${MCP_DEEP_RESEARCH_PATH} && uv run python -m mcp_server.server`],
  env: {
    OPENAI_API_KEY: process.env.OPENAI_API_KEY,
    DEEP_RESEARCH_MODEL: process.env.DEEP_RESEARCH_MODEL,
  },
}
```

## Docker Configuration
Set these in your Docker environment or docker-compose.yml:

```yaml
environment:
  - OPENAI_API_KEY=sk-...
  - DEEP_RESEARCH_MODEL=o4-mini-deep-research-2025-06-26  # or o3-deep-research-2025-06-26
```

## Pricing

Deep research costs vary based on:

- Number of web searches performed
- Code interpreter usage
- Token consumption

Approximate costs:

- `o4-mini`: lower cost, faster responses
- `o3`: roughly $1+ per complex query with many web searches

||||
## Test Results (2024-12-30)
Successfully tested with query: "What is the current population of Tokyo in 2024?"

```
Status: completed
Model: o4-mini-deep-research-2025-06-26
Elapsed time: 80.5s
Web searches: 20
Citations: 4

Report excerpt:
# Tokyo Population (2024)
As of late 2024, the official population of Tokyo Metropolis is about 14.2 million people.
```

## License
MIT