Enterpret's MCP Server Is Live: Connect Customer Feedback to Claude, Cursor, and ChatGPT
If you've tried using Claude or Cursor to answer a product question, you've probably hit the same wall: it can reason about your code, your docs, your architecture, but it has no idea what your customers actually said last quarter.
The Enterpret MCP server fixes that. It's live today.
What is Model Context Protocol?
MCP (Model Context Protocol) is an open standard Anthropic built to let AI tools connect to external data. Before it, every integration was a one-off build. Now you write one server, and any compatible client (Claude, Cursor, ChatGPT, Glean, n8n, and others) can call it.
Adoption has moved surprisingly fast. Anthropic launched an official MCP Gallery so Claude users can connect approved servers in one click. OpenAI added support earlier this year. Cursor, Glean, and n8n followed. At this point, the question for any team building on AI isn't whether their tools will support MCP; it's which data they're connecting through it.
Our view: start with customer feedback. It's not the flashiest integration to announce. It's just the one most likely to change the answer to something your team is actually arguing about.
The Enterpret MCP Server
It connects any MCP-compatible AI client to your Customer Context Graph, Enterpret's structured model of what your customers are saying across support tickets, NPS responses, app store reviews, Gong transcripts, Salesforce notes, and more.
Four tools, callable by any AI agent:
- `search_knowledge_graph` — find feedback by topic, theme, or product area
- `execute_cypher_query` — run structured queries across your full dataset
- `get_schema` — understand your feedback taxonomy
- `get_organization_details` — initialize a session with your org's context
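Under the hood, MCP clients invoke tools like these with JSON-RPC `tools/call` requests. Here's a minimal sketch of what such a request looks like; the argument names (`query`, `source`) are illustrative, not Enterpret's documented schema — an agent would discover the real parameters via `get_schema`:

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request, the wire format MCP clients use."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(payload)

# Hypothetical arguments -- real field names come from the get_schema tool.
req = build_tool_call(
    "search_knowledge_graph",
    {"query": "checkout errors", "source": "support_tickets"},
)
```

In practice your AI client builds and sends these for you; the point is that any MCP-compatible agent speaks the same protocol, so the four tools above are callable everywhere.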
Every response includes citations linking back to source feedback. The tools work mid-conversation, inside agent pipelines, or as steps in automated workflows, with no separate context switch.
To initialize: type /enterpret in your AI tool and select Initialize Enterpret. Takes about five seconds, and the difference in answer quality is real.
Where It's Available Today
Claude Desktop, Claude.ai, and the Claude API — Native connector in the Claude Connector Directory. Click Add. Works for conversational and agentic use.
Cursor — One-click from your Enterpret dashboard (Settings > Enterpret MCP > Connect), or manual config via npx mcp-remote. Cursor's MCP Directory listing is coming soon as a third install path.
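For the manual route, Cursor reads MCP server definitions from `~/.cursor/mcp.json`. A minimal entry using `mcp-remote` might look like this (the server name `enterpret` is arbitrary):

```json
{
  "mcpServers": {
    "enterpret": {
      "command": "npx",
      "args": ["mcp-remote", "https://enterpret-api.enterpret.com/server/mcp"]
    }
  }
}
```

`mcp-remote` bridges Cursor's local MCP transport to the remote server and handles the OAuth handshake in your browser on first connect.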
Notion — Via Notion's custom agent connector. Query Enterpret while drafting PRDs, no tab-switching required.
Glean — Available in Glean Gallery. Customer feedback joins Glean's internal knowledge index, delivering the same cited, grounded answers you'd get searching an internal doc.
n8n — Embed Enterpret queries into automated pipelines and scheduled workflows.
Custom apps — Connection URL: https://enterpret-api.enterpret.com/server/mcp. Supports OAuth and Bearer token auth.
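For a custom app using Bearer token auth, a request to the endpoint is a plain authenticated HTTP POST carrying a JSON-RPC body. A minimal sketch with Python's standard library — the token value and the `tools/list` call are illustrative:

```python
import json
import urllib.request

ENTERPRET_MCP_URL = "https://enterpret-api.enterpret.com/server/mcp"

def mcp_request(token: str, method: str, params: dict, request_id: int = 1) -> urllib.request.Request:
    """Prepare an authenticated JSON-RPC request for the MCP endpoint.

    Bearer auth is shown here; an OAuth flow would supply the token instead.
    """
    body = json.dumps(
        {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    ).encode()
    return urllib.request.Request(
        ENTERPRET_MCP_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Build (but don't send) a request listing the server's available tools.
req = mcp_request("YOUR_TOKEN", "tools/list", {})
```

Sending it is then a single `urllib.request.urlopen(req)` call; most apps would use an MCP client SDK instead, which handles session setup and streaming responses for you.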
ChatGPT and LibreChat integrations are coming soon.
What This Looks Like in Practice
A PM asks Claude what drove enterprise churn last quarter. Claude queries Enterpret, pulls verbatim examples with source citations, and drafts the exec summary, without leaving the chat.
A developer spots a bug in the iOS checkout flow. Before touching the code, they ask how many customers reported it and whether it's concentrated in any account tier. Enterpret returns the count, the breakdown, the timeline. They make a different triage call.
An n8n agent runs every Monday morning, checks Enterpret for new complaint patterns from the prior week, cross-references open Jira tickets, and drops a prioritized digest into #product-feedback in Slack.
None of these required a dashboard login or a separate context switch.
Want to see it in action first? Request a demo and we'll walk you through it.
Get Started
Setup takes a few clicks — from an MCP gallery, your Enterpret dashboard, or the connection URL above.
We've spent five years on the problem of getting customer signal to the people who need it when they need it. An MCP server is the most direct version of that we've shipped.
Read more in our help center here >


