
Build Custom MCP Server

Skill · Verified · Active
Part of: Agent Almanac

Build a custom MCP (Model Context Protocol) server that exposes domain-specific tools to AI assistants. Covers server implementation in Node.js or R, tool definitions, transport configuration, and testing with Claude Code. Use when you need to expose custom functionality beyond what mcptools provides, when building specialized domain-specific AI integrations, or when wrapping existing APIs or services as MCP tools.
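At the protocol level, an MCP server is a JSON-RPC 2.0 endpoint that answers `tools/list` and `tools/call` requests. The sketch below shows that shape in plain Node.js with no SDK dependency; a real server would typically use the official `@modelcontextprotocol/sdk` package instead, and the `add_numbers` tool here is purely illustrative.

```javascript
// Minimal protocol-level sketch of an MCP server's request handling.
// Tool name and schema are illustrative, not from the skill itself.

// One example domain tool: add two numbers.
const TOOLS = [
  {
    name: "add_numbers",
    description: "Add two numbers and return the sum.",
    inputSchema: {
      type: "object",
      properties: {
        a: { type: "number" },
        b: { type: "number" },
      },
      required: ["a", "b"],
    },
  },
];

// Dispatch a single JSON-RPC 2.0 request object to a response object.
function handleRequest(req) {
  const reply = (result) => ({ jsonrpc: "2.0", id: req.id, result });
  const fail = (code, message) => ({
    jsonrpc: "2.0",
    id: req.id,
    error: { code, message },
  });

  switch (req.method) {
    case "tools/list":
      return reply({ tools: TOOLS });
    case "tools/call": {
      const { name, arguments: args } = req.params;
      if (name !== "add_numbers") return fail(-32602, `unknown tool: ${name}`);
      const sum = args.a + args.b;
      // MCP tool results carry a list of content blocks.
      return reply({ content: [{ type: "text", text: String(sum) }] });
    }
    default:
      return fail(-32601, `method not found: ${req.method}`);
  }
}

// stdio transport: newline-delimited JSON-RPC on stdin/stdout.
// (Commented out so the sketch can be loaded without blocking.)
// const readline = require("node:readline");
// readline.createInterface({ input: process.stdin }).on("line", (line) => {
//   process.stdout.write(JSON.stringify(handleRequest(JSON.parse(line))) + "\n");
// });
```

The same dispatch structure applies regardless of transport; swapping stdio for HTTP only changes how request objects arrive and responses leave.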

Purpose

To empower users to build specialized AI integrations by exposing custom functionality or wrapping existing APIs as MCP tools, extending beyond default capabilities.

Features

  • Build custom MCP servers in Node.js or R
  • Define domain-specific tools with parameters and return types
  • Configure transport (stdio/HTTP) and authentication
  • Implement error handling and validation
  • Package server for distribution and test with AI assistants
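The final step above, testing with an AI assistant, requires registering the server with Claude Code. One way is a project-level `.mcp.json` entry; the server name and the `./server.js` path are illustrative placeholders for your own build:

```json
{
  "mcpServers": {
    "my-domain-tools": {
      "command": "node",
      "args": ["./server.js"]
    }
  }
}
```

An equivalent CLI route, assuming the same placeholder names, is `claude mcp add my-domain-tools -- node ./server.js`.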

Use cases

  • Exposing custom functionality to AI assistants when existing tools are insufficient
  • Creating specialized domain-specific AI integrations
  • Wrapping existing APIs or services as MCP tools for AI consumption
  • Enabling AI assistants to interact with proprietary systems

Non-goals

  • Building the AI assistant client that consumes the MCP server
  • Providing pre-built MCP servers for common domains
  • Detailed guidance on specific API wrapping scenarios beyond MCP implementation

Installation

/plugin install agent-almanac@pjt222-agent-almanac

Quality score

Verified
100/100
Analyzed about 18 hours ago

Trust signals

Last commit: 1 day ago
Stars: 14
License: MIT
Status
View source code

Similar extensions

Containerize MCP Server

100

Containerize an R-based MCP (Model Context Protocol) server using Docker. Covers mcptools integration, port exposure, stdio vs HTTP transport, and connecting Claude Code to the containerized server. Use when deploying an R MCP server without requiring a local R installation, creating a reproducible MCP server environment, running MCP servers alongside other containerized services, or distributing an MCP server to other developers.

Skill
pjt222

Optimize Docker Build Cache

99

Optimize Docker build times using layer caching, multi-stage builds, BuildKit features, and dependency-first copy patterns. Applicable to R, Node.js, and Python projects. Use when Docker builds are slow due to repeated package installations, when rebuilds reinstall all dependencies on every code change, when image sizes are unnecessarily large, or when CI/CD pipeline builds are a bottleneck.

Skill
pjt222

Implement A2A Server

99

Implement a JSON-RPC 2.0 A2A server with full task lifecycle management (submitted/working/completed/failed/canceled/input-required), SSE streaming, and push notifications. Use when implementing an agent that participates in multi-agent A2A workflows, building a backend for an Agent Card, adding A2A protocol support to an existing agent or service, or deploying an agent that must interoperate with other A2A-compliant agents.

Skill
pjt222

Claude MCP Expert

98

Model Context Protocol (MCP) expert for Claude Code. Install, configure, and troubleshoot MCP servers. Covers HTTP, SSE, and stdio transports, authentication, popular integrations (Sentry, GitHub, Jira, Notion, databases). Triggers on MCP, Model Context Protocol, MCP server, installing MCP, connecting tools, webhooks, remote server.

Skill
raintree-technology

Salesforce Developer

100

Writes and debugs Apex code, builds Lightning Web Components, optimizes SOQL queries, implements triggers, batch jobs, platform events, and integrations on the Salesforce platform. Use when developing Salesforce applications, customizing CRM workflows, managing governor limits, bulk processing, or setting up Salesforce DX and CI/CD pipelines.

Skill
jeffallan

Arize AI Provider Integration

100

Creates, reads, updates, and deletes Arize AI integrations that store LLM provider credentials used by evaluators and other Arize features. Supports any LLM provider (e.g. OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, Vertex AI, Gemini, NVIDIA NIM). Use when the user mentions AI integration, LLM provider credentials, create integration, list integrations, update credentials, delete integration, or connecting an LLM provider to Arize.

Skill
github