MCP Server Setup

Sample Implementation Only

This is a sample use case demonstrating how to wrap the AlphaSense Agent API with MCP. It is not a production server and is not hosted by AlphaSense. The implementation below is intended as a reference for running a locally hosted MCP server on your own machine.


What is MCP?

The Model Context Protocol (MCP) is an open standard for connecting AI models to external data sources and tools. Instead of copying data into prompts manually, MCP lets an AI model discover available tools, understand their schemas, and call them on the fly. The result is a seamless, structured bridge between large language models and the systems they need to query.
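Concretely, discovery happens over JSON-RPC 2.0. A client asks the server what tools it offers with a `tools/list` request (a sketch of the protocol shape, not a verbatim transcript):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
```

The server replies with each tool's name, description, and JSON Schema input, which the model then uses to construct `tools/call` requests.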


Why MCP?

With a running MCP server your assistant can:

  • Run GenSearch queries and receive cited, markdown-formatted answers — all without leaving your editor or chat interface.

Installation

Install the required Python packages:

pip install mcp requests

Then set the following environment variables (or add them to a .env file):

.env
ALPHASENSE_API_KEY=your_api_key_here
ALPHASENSE_CLIENT_ID=your_client_id_here
ALPHASENSE_CLIENT_SECRET=your_client_secret_here
ALPHASENSE_EMAIL=your_email_here
ALPHASENSE_PASSWORD=your_password_here
Don't have credentials?

Contact your account team at support@alpha-sense.com to request API access.
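The server script below reads credentials with `os.environ`, so variables kept in a `.env` file must be loaded into the environment before the script starts — for example with the `python-dotenv` package, or by exporting them in your shell. As a dependency-free sketch, a minimal loader might look like this (the parsing rules here are simplified assumptions; they skip quoting and multi-line values, which the real dotenv format supports):

```python
import os


def load_env_file(path: str = ".env") -> None:
    """Load simple KEY=VALUE lines from a .env file into os.environ.

    Skips blank lines and '#' comments. Existing environment variables
    are not overwritten. Use python-dotenv for the full .env format.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

Call `load_env_file()` once at the top of the server script, before any `os.environ[...]` lookups.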


Available tools

The MCP server exposes a single tool. Its description and full JSON Schema input are shown below.

Description: AI-powered research using GenSearch. Returns markdown with [[N • Source]] citations.

{
  "name": "alphasense_search",
  "description": "AI-powered research using GenSearch. Returns markdown with [[N \u2022 Source]] citations.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "prompt": {
        "type": "string",
        "description": "The natural-language research question to send to GenSearch."
      },
      "mode": {
        "type": "string",
        "enum": ["auto", "fast", "thinkLonger", "deepResearch"],
        "default": "auto",
        "description": "GenSearch mode: auto (~30-90s, 10 credits, recommended), fast (~30s, 10 credits), thinkLonger (~60-90s, 25 credits), or deepResearch (~12-15min, 100 credits)."
      }
    },
    "required": ["prompt"]
  }
}
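When a model invokes the tool, its MCP client sends a `tools/call` request whose `arguments` must satisfy the schema above. A sketch of such a request (the prompt value is purely illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "alphasense_search",
    "arguments": {
      "prompt": "Summarize recent commentary on semiconductor demand",
      "mode": "fast"
    }
  }
}
```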

Server Implementation

Below is a complete, runnable MCP server script. Save it as alphasense_mcp_server.py and point your client configuration to this file.

alphasense_mcp_server.py
#!/usr/bin/env python3
"""AlphaSense MCP Server — exposes GenSearch as an MCP tool."""

import os
import time

import requests
from mcp.server.fastmcp import FastMCP

# ---------------------------------------------------------------------------
# Configuration
# ---------------------------------------------------------------------------
API_KEY = os.environ["ALPHASENSE_API_KEY"]
CLIENT_ID = os.environ["ALPHASENSE_CLIENT_ID"]
CLIENT_SECRET = os.environ["ALPHASENSE_CLIENT_SECRET"]
EMAIL = os.environ["ALPHASENSE_EMAIL"]
PASSWORD = os.environ["ALPHASENSE_PASSWORD"]

AUTH_URL = "https://api.alpha-sense.com/auth"
GRAPHQL_URL = "https://api.alpha-sense.com/gql"

# ---------------------------------------------------------------------------
# Token caching
# ---------------------------------------------------------------------------
_token_cache = {"token": None, "expires_at": 0}


def get_access_token() -> str:
    """Return a valid access token, refreshing if necessary."""
    if _token_cache["token"] and time.time() < _token_cache["expires_at"]:
        return _token_cache["token"]

    response = requests.post(
        AUTH_URL,
        headers={
            "x-api-key": API_KEY,
            "Content-Type": "application/x-www-form-urlencoded",
        },
        data={
            "grant_type": "password",
            "username": EMAIL,
            "password": PASSWORD,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
    )
    response.raise_for_status()
    data = response.json()
    _token_cache["token"] = data["access_token"]
    # Refresh five minutes before the token actually expires.
    _token_cache["expires_at"] = time.time() + data["expires_in"] - 300
    return _token_cache["token"]


def gql_headers() -> dict:
    """Return headers for authenticated GraphQL requests."""
    return {
        "x-api-key": API_KEY,
        "clientid": CLIENT_ID,
        "Authorization": f"Bearer {get_access_token()}",
        "Content-Type": "application/json",
    }


# ---------------------------------------------------------------------------
# MCP Server
# ---------------------------------------------------------------------------
mcp = FastMCP("AlphaSense")

MUTATIONS = {
    "auto": "mutation M($input: GenSearchInput!) { genSearch { auto(input: $input) { id } } }",
    "fast": "mutation M($input: GenSearchInput!) { genSearch { fast(input: $input) { id } } }",
    "thinkLonger": "mutation M($input: GenSearchInput!) { genSearch { thinkLonger(input: $input) { id } } }",
    "deepResearch": "mutation M($input: GenSearchInput!) { genSearch { deepResearch(input: $input) { id } } }",
}

POLL_INTERVALS = {"auto": 4, "fast": 3, "thinkLonger": 5, "deepResearch": 10}
TIMEOUTS = {"auto": 180, "fast": 120, "thinkLonger": 180, "deepResearch": 1800}


@mcp.tool()
def alphasense_search(prompt: str, mode: str = "auto") -> str:
    """AI-powered research using GenSearch. Returns markdown with [[N • Source]] citations.

    Args:
        prompt: The natural-language research question to send to GenSearch.
        mode: GenSearch mode — "auto" (~30-90s, recommended), "fast" (~30s), "thinkLonger" (~60-90s), or "deepResearch" (~12-15min).
    """
    if mode not in MUTATIONS:
        return f"Error: invalid mode '{mode}'. Use auto, fast, thinkLonger, or deepResearch."

    # Initiate the search and capture the conversation ID
    resp = requests.post(
        GRAPHQL_URL,
        headers=gql_headers(),
        json={"query": MUTATIONS[mode], "variables": {"input": {"prompt": prompt}}},
    )
    resp.raise_for_status()
    conversation_id = resp.json()["data"]["genSearch"][mode]["id"]

    # Poll until the conversation completes, errors, or times out
    poll_query = """
    query Q($conversationId: String!) {
      genSearch {
        conversation(id: $conversationId) {
          id markdown progress error { code }
        }
      }
    }
    """
    start = time.time()
    interval = POLL_INTERVALS[mode]
    timeout = TIMEOUTS[mode]

    while time.time() - start < timeout:
        poll_resp = requests.post(
            GRAPHQL_URL,
            headers=gql_headers(),
            json={"query": poll_query, "variables": {"conversationId": conversation_id}},
        )
        poll_resp.raise_for_status()
        data = poll_resp.json()["data"]["genSearch"]["conversation"]

        if data.get("error"):
            return f"GenSearch error: {data['error']['code']}"

        if data.get("progress", 0) >= 1.0:
            return data["markdown"]

        time.sleep(interval)

    return "Error: GenSearch timed out."


if __name__ == "__main__":
    mcp.run()
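An MCP client launches this script as a subprocess and talks to it over stdio. The exact configuration format depends on your client; as an example, a Claude Desktop-style `mcpServers` entry might look like the following (the path and all credential values are placeholders you must replace):

```json
{
  "mcpServers": {
    "alphasense": {
      "command": "python",
      "args": ["/path/to/alphasense_mcp_server.py"],
      "env": {
        "ALPHASENSE_API_KEY": "your_api_key_here",
        "ALPHASENSE_CLIENT_ID": "your_client_id_here",
        "ALPHASENSE_CLIENT_SECRET": "your_client_secret_here",
        "ALPHASENSE_EMAIL": "your_email_here",
        "ALPHASENSE_PASSWORD": "your_password_here"
      }
    }
  }
}
```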

Bridge to other Agent API features

The MCP server is one way to access the AlphaSense Agent API. For more advanced workflows, explore:

  • GenSearch — the underlying AI-powered research engine that powers the alphasense_search tool.