
Tool Integration via MCP: How We Connect Voice AI to External Services

The Model Context Protocol (MCP) standardizes how AI agents discover and use tools. Here's why we've built native support—and how it works.

Large language models can reason about anything. But they can't do anything without tool integration.

Ask an LLM to check your calendar, and it apologizes. Ask it to send an email, and it explains why it can't. The intelligence is there; the capability isn't.

This is the tool integration problem. And the Model Context Protocol (MCP) is emerging as the standard solution.


Why Tool Integration Is Hard

The simplest approach seems straightforward: give the AI functions to call.

const tools = {
  // Expose a calendar lookup as a function the model can call
  check_calendar: async (date) => {
    return await calendarAPI.getEvents(date);
  },
};

This fails at scale for several reasons:

The Discovery Problem

How does the AI know which tools exist? You can list them in the system prompt, but that doesn't scale. The AI needs to discover relevant tools dynamically based on conversation context.

"Check my calendar" should surface calendar tools. "Research this company" should surface search tools. "I'm stressed about the meeting tomorrow" might benefit from calendar tools, preparation tools, and coaching tools.

The Reliability Problem

Tool calls fail. APIs time out. Auth tokens expire. Rate limits kick in.

Building resilient tool integration requires:

  • Retry logic with exponential backoff
  • Graceful degradation when tools are unavailable
  • Error explanation that makes sense in conversation
  • Alternative suggestions when primary tools fail

Every AI developer implements these patterns from scratch.
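The retry pattern above can be sketched in a few lines. This is a generic wrapper, not Ferni's implementation; the names and defaults are illustrative.

```typescript
// A minimal sketch of retry with exponential backoff, assuming any
// async tool call can throw on transient failures.
async function withRetry<T>(
  call: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await call();
    } catch (err) {
      lastError = err;
      // Back off exponentially: 100ms, 200ms, 400ms, ...
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  // All attempts exhausted: surface the last error.
  throw lastError;
}

// Usage: wrap any flaky tool call, e.g.
// const events = await withRetry(() => calendarAPI.getEvents(date));
```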

The Security Problem

Tools act on behalf of users. They access sensitive data, make changes, spend money.

Questions every AI developer faces:

  • How do you authenticate tool calls?
  • How do you scope permissions?
  • How do you prevent prompt injection from triggering unintended tool calls?
  • How do you audit what tools did?
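One common answer to the scoping question is an allowlist of permission scopes per session, checked before every tool call. A minimal sketch, with scope names that are purely illustrative (not part of MCP):

```typescript
// Each session carries a grant; each tool declares the scope it needs.
type Scope = "calendar:read" | "email:send" | "crm:read";

interface SessionGrant {
  userId: string;
  scopes: Set<Scope>;
}

// Gate every tool call on the session's granted scopes.
function canCall(grant: SessionGrant, requiredScope: Scope): boolean {
  return grant.scopes.has(requiredScope);
}
```

Auditing then falls out naturally: log every `canCall` decision alongside the tool invocation it gated.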

The Composition Problem

Real tasks require multiple tools. "Plan my trip to Paris" needs flight search, hotel booking, calendar checking, and payment processing—all coordinated.

Sequencing is hard. Error handling across sequences is harder. Rolling back partial completions when later steps fail is hardest.
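The rollback case can be sketched with compensating actions: each step pairs a "do" with an "undo", and a failure part-way through unwinds the completed steps in reverse. This is a generic pattern sketch, not a description of any particular engine.

```typescript
// A step that knows how to reverse itself if a later step fails.
interface Step {
  name: string;
  run: () => Promise<void>;
  undo: () => Promise<void>;
}

async function runWithRollback(steps: Step[]): Promise<void> {
  const completed: Step[] = [];
  for (const step of steps) {
    try {
      await step.run();
      completed.push(step);
    } catch (err) {
      // Unwind already-completed steps in reverse, then rethrow.
      for (const done of completed.reverse()) {
        await done.undo();
      }
      throw err;
    }
  }
}
```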


How MCP Solves This

MCP's key insight: separate tool servers from AI systems.

Instead of building integrations directly into AI applications, MCP defines a protocol for external servers that expose tools. Any AI can connect to any MCP server. Tool providers build once, work everywhere.

An MCP server exposes three types of capabilities:

Tools: Actions the AI can take

calendar.getEvents(date) - Read calendar
email.send(to, subject, body) - Send email
crm.lookupCustomer(query) - Search CRM

Resources: Data the AI can read

file://reports/quarterly.pdf - Read a file
db://customers/recent - Query a database

Prompts: Reusable templates

summarize-meeting - Standard meeting summary format
write-followup - Template for follow-up emails

When an AI connects to an MCP server, it discovers what's available. The server describes each tool's purpose, parameters, and return types. The AI uses this to decide when and how to call tools.
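In the MCP spec, a tool description carries a name, a human-readable description, and a JSON Schema for its inputs. A simplified TypeScript sketch of that shape, using the calendar example from above:

```typescript
// Simplified shape of an MCP tool description (the real spec has more fields).
interface ToolDescriptor {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required?: string[];
  };
}

const getEventsTool: ToolDescriptor = {
  name: "calendar.getEvents",
  description: "List calendar events for a given date",
  inputSchema: {
    type: "object",
    properties: {
      date: { type: "string", description: "ISO 8601 date, e.g. 2025-03-14" },
    },
    required: ["date"],
  },
};
```

The schema is what lets the AI extract parameters from natural language: "what's on my calendar Friday" maps to a `date` string.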


Why MCP Matters

The surface benefit is standardization. Build an MCP server once, use it with any AI.

The deeper implications are more significant:

Tools Become a Marketplace

With a standard protocol, tools can be distributed independently of AI applications. A CRM vendor can publish an MCP server that works with any AI system. Users install it without AI vendor involvement.

This unbundles tool integration from AI development.

Security Gets Solved Once

MCP defines standard patterns for authentication, authorization, and auditing. Instead of every application inventing its own security, best practices emerge at the protocol level.

Vertical Specialization Accelerates

Building industry-specific AI today requires significant integration work. Healthcare AI needs EHR integration. Legal AI needs document management.

With MCP, these integrations become reusable. A healthcare MCP server can integrate with Epic, Cerner, and Meditech—and any AI application can use it.


Our Implementation

Ferni supports MCP natively. Register an MCP server and its tools become voice-callable.

But we've addressed patterns MCP doesn't cover:

Semantic Tool Discovery

MCP tells the AI what tools exist. It doesn't help decide which tools are relevant.

Our semantic layer (src/tools/dynamic-tool-router.ts) maps conversation intent to tool relevance. When you mention stress about tomorrow, we identify that calendar tools, preparation tools, and coaching tools are all potentially relevant—even though you didn't mention them.

The tool orchestrator (src/tools/orchestrator/unified-tool-orchestrator.ts) handles per-turn selection:

// Request comes in
const request = 'Help me process grief';

// Intent detection identifies domains
const intent = await detectToolIntent(request);
// → { domains: ["grief"], confidence: 0.9 }

// Lazy-load relevant domain
const tools = await getToolsForDomains(['grief']);

// Semantic router scores by relevance
const selected = await semanticRouter.score(tools, request);

// Top tools sent to LLM
// → processGrief(), exploreFeelings(), suggestResources()

Automatic Error Recovery

MCP defines success paths. We've defined failure handling:

  • Automatic retry: Exponential backoff on transient failures
  • Circuit breakers: Repeatedly failing tools are disabled temporarily
  • Graceful degradation: Alternative tools surface when primary tools fail
  • Conversational errors: "Let me try another way" rather than error messages
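The circuit-breaker pattern from that list can be sketched as a small state machine: after N consecutive failures the tool is disabled for a cooldown window, so the conversation falls back to alternatives instead of hammering a dead integration. A generic sketch, not our production code:

```typescript
class CircuitBreaker {
  private failures = 0;
  private openedAt: number | null = null;

  constructor(
    private readonly maxFailures = 3,
    private readonly cooldownMs = 30_000,
  ) {}

  // Open = tool is temporarily disabled.
  isOpen(now = Date.now()): boolean {
    if (this.openedAt === null) return false;
    if (now - this.openedAt >= this.cooldownMs) {
      // Cooldown elapsed: close the circuit and allow a fresh attempt.
      this.openedAt = null;
      this.failures = 0;
      return false;
    }
    return true;
  }

  recordSuccess(): void {
    this.failures = 0;
    this.openedAt = null;
  }

  recordFailure(now = Date.now()): void {
    this.failures++;
    if (this.failures >= this.maxFailures) this.openedAt = now;
  }
}
```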

Tool Composition Engine

Complex tasks require coordination. Our composition engine handles:

  • Sequential execution: Tools run in order with results passing between them
  • Parallel execution: Independent tools run simultaneously
  • Conditional execution: Tool choice based on intermediate results
  • Transaction management: Rollback on failure
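The sequential and parallel modes reduce to familiar promise combinators. A toy sketch with stand-in tool functions (the real engine handles errors and conditionals on top of this):

```typescript
type Tool<I, O> = (input: I) => Promise<O>;

// Sequential: each tool's output feeds the next.
async function pipeline<A, B, C>(
  first: Tool<A, B>,
  second: Tool<B, C>,
  input: A,
): Promise<C> {
  return second(await first(input));
}

// Parallel: independent tools run simultaneously and join at the end.
async function parallel<A, B>(a: Promise<A>, b: Promise<B>): Promise<[A, B]> {
  return Promise.all([a, b]);
}
```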

Connecting an MCP Server

Registration takes minutes:

1. Register your server:

curl -X POST https://api.ferni.ai/api/v2/developers/mcp-servers \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "name": "my-crm",
    "endpoint": "https://mcp.yourservice.com",
    "transport": "http"
  }'

2. Test the connection:

curl -X POST https://api.ferni.ai/api/v2/developers/mcp-servers/{id}/test \
  -H "Authorization: Bearer $API_KEY"

3. Start talking:

Once connected, tools are voice-callable. Say "look up customer Acme Corp" and it works.

No prompt engineering. No integration code. The platform handles discovery, parameter extraction, and natural language responses.


Where This Goes

MCP is early. The specification is evolving. But the direction is clear: standardized tool integration is coming.

We expect:

  • Major AI vendors to announce MCP support
  • Tool marketplaces to emerge
  • Security standards to solidify
  • Orchestration primitives to enter the spec

We're building for this future. Native MCP support, semantic discovery, composition engines—these position us for what's coming.


Strategic Takeaway

If you're building AI applications:

Support MCP now. Even if adoption is limited today, the trajectory is clear. Building on MCP means your integrations work with tomorrow's AI systems.

Build MCP servers if you have data or capabilities. Exposing an MCP interface is an investment in distribution across the AI ecosystem.

Think about tools as a layer. The companies that win in AI won't just have the best models. They'll have the best tool ecosystems.




Seth Ford is Ferni's AI babysitter. Follow @ferni_ai for more on the future of conversational AI.