Sentry MCP Experimental
Plugin · Verified · Active · Part of: Sentry MCP
Sentry MCP with experimental features enabled
Purpose
To enable AI coding assistants to effectively interact with Sentry for debugging and workflow optimization.
Features
- Integrates with Sentry API for error tracking and performance monitoring.
- Provides tools for searching, analyzing, and triaging Sentry issues.
- Supports both cloud and self-hosted Sentry instances.
- Configurable via environment variables and a JSON configuration file.
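The configuration described above is typically wired into an MCP client config file. A minimal sketch, assuming the common `mcpServers` JSON layout; the variable names `SENTRY_ACCESS_TOKEN` and `SENTRY_HOST` are assumptions for illustration, not confirmed by this listing:

```json
{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["@sentry/mcp-server@latest"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "<your-user-auth-token>",
        "SENTRY_HOST": "sentry.example.com"
      }
    }
  }
}
```

For cloud-hosted Sentry the host entry can usually be omitted; for a self-hosted instance, point it at your own domain.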
Use Cases
- Use when troubleshooting errors or performance bottlenecks reported in Sentry.
- Use when needing to investigate Sentry issues directly from an AI coding assistant.
- Use when managing Sentry projects, teams, or releases via an AI agent.
Non-Goals
- This is not a general-purpose MCP server for all Sentry functionality.
- It does not replace the full Sentry UI or its advanced features.
- It does not manage Sentry billing or user account administration.
Trust
- Issues: There were 21 issues opened and 31 closed in the last 90 days. The closure rate is approximately 60%, indicating active maintainer attention to issues.
Execution
- Pinned dependencies: The README mentions using `pnpm` and `@sentry/mcp-server@latest`, suggesting some dependency management, but explicit pinning and lockfiles are not discussed in detail for the main plugin installation.
Installation
First, add the marketplace, then install the plugin:

```
/plugin marketplace add getsentry/sentry-mcp
/plugin install sentry-mcp-experimental@sentry-mcp
```

Quality Score
Verified: 96/100
Analyzed about 24 hours ago
Trust Signals
Last commit: 3 days ago
GitHub owner: getsentry
Stars: 686
License: NOASSERTION
Similar Extensions
- Microsoft Learn MCP Server (Plugin by MicrosoftDocs, score 100): Access official Microsoft documentation, API references, and code samples for Azure, .NET, Windows, and more.
- Context7 Plugin (Plugin by upstash, score 100): Upstash Context7 MCP server for up-to-date documentation lookup. Pull version-specific documentation and code examples directly from source repositories into your LLM context.
- Llm Cost Optimizer (Plugin by alirezarezvani, score 99): Use when you need to reduce LLM API spend, control token usage, route between models by cost/quality, implement prompt caching, or build cost observability for AI features. Triggers: 'my AI costs are