[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"extension-skill-aliengiraffe-spotdb-tr":3,"guides-for-aliengiraffe-spotdb":233,"similar-k17231zep11befm3g43rsa1yv5864trn":234},{"_creationTime":4,"_id":5,"children":6,"community":7,"display":9,"evaluation":25,"identity":192,"isFallback":196,"parentExtension":197,"providers":198,"relations":203,"repo":205,"workflow":229},1777995620896.9917,"k17231zep11befm3g43rsa1yv5864trn",[],{"reviewCount":8},0,{"description":10,"installMethods":11,"name":13,"sourceUrl":14,"tags":15},"Extension from aliengiraffe/spotdb",{"docker":12},"aliengiraffe/spotdb","SpotDB","https://github.com/aliengiraffe/spotdb",[16,17,18,19,20,21,22,23,24],"database","sql","duckdb","go","data-analysis","sandbox","mcp","api","csv",{"_creationTime":26,"_id":27,"extensionId":5,"locale":28,"result":29,"trustSignals":178,"workflow":190},1777995897177.4507,"kn779affvzavhpb9tr098n36mh8647s5","en",{"checks":30,"evaluatedAt":168,"extensionSummary":169,"promptVersionExtension":170,"promptVersionScoring":171,"rationale":172,"score":173,"summary":174,"tags":175,"targetMarket":176,"tier":177},[31,36,39,42,46,49,53,57,60,63,67,71,74,78,81,84,87,90,93,96,100,104,108,112,116,119,122,125,129,132,135,138,141,144,148,152,155,158,161,165],{"category":32,"check":33,"severity":34,"summary":35},"Practical Utility","Problem relevance","pass","The description clearly states the extension's purpose: to provide a secure, ephemeral SQL sandbox for AI workflows and data exploration, enabling safe AI data analysis without risk to production systems.",{"category":32,"check":37,"severity":34,"summary":38},"Unique selling proposition","The extension offers a unique selling proposition by providing a secure, ephemeral SQL sandbox with a focus on AI integration (MCP support), data privacy (ephemeral storage), and ease of use (auto CSV parsing, instant analytics), going beyond a simple API wrapper.",{"category":32,"check":40,"severity":34,"summary":41},"Production readiness","The 
extension appears production-ready with features like MCP integration, REST API, DuckDB core, snapshotting with S3 integration, security validation, rate limiting, and a clear quick start guide.",{"category":43,"check":44,"severity":34,"summary":45},"Scope","Single responsibility principle","The extension has a single, well-defined responsibility: providing a secure, ephemeral SQL sandbox primarily for AI workflows and data exploration, with clear boundaries around its capabilities.",{"category":43,"check":47,"severity":34,"summary":48},"Description quality","The description is concise, readable, and accurately reflects the extension's capabilities, including its core features, data access methods, security aspects, and technical foundation.",{"category":50,"check":51,"severity":34,"summary":52},"Invocation","Scoped tools","The extension exposes specific tools like `read_query`, `write_query`, `create_datasource`, `list_datasources`, `describe_datasource`, and `append_insight`, which are narrow verb-noun specialists.",{"category":54,"check":55,"severity":34,"summary":56},"Documentation","Configuration & parameter reference","All environment variables for configuration are documented with their purpose and default values, and their precedence is clear.",{"category":43,"check":58,"severity":34,"summary":59},"Tool naming","All user-facing tool names (`read_query`, `write_query`, etc.) 
are descriptive, easy to understand, and follow snake_case conventions.",{"category":43,"check":61,"severity":34,"summary":62},"Minimal I/O surface","The tool parameters (e.g., `query` for `read_query`) are specific to the task, and the API responses are structured and do not contain unnecessary diagnostic dumps.",{"category":64,"check":65,"severity":34,"summary":66},"License","License usability","The extension is licensed under the MIT license, which is a permissive open-source license.",{"category":68,"check":69,"severity":34,"summary":70},"Maintenance","Commit recency","The last commit was 5 days ago, indicating active maintenance.",{"category":68,"check":72,"severity":34,"summary":73},"Dependency Management","The project uses Go modules and has pre-commit hooks configured with golangci-lint and gofmt, suggesting good dependency management practices.",{"category":75,"check":76,"severity":34,"summary":77},"Security","Secret Management","No hardcoded secrets are present in the committed files. Secrets handling relies on environment variables or AWS credentials for S3 operations, which is standard practice.",{"category":75,"check":79,"severity":34,"summary":80},"Injection","The `validateQuery` function appears to sanitize SQL queries by checking for malicious patterns, and table names are sanitized in `ExecuteQueryWithTableName` and other database interactions.",{"category":75,"check":82,"severity":34,"summary":83},"Transitive Supply-Chain Grenades","The code does not appear to fetch or execute external code at runtime, and all dependencies are managed via Go modules. No symlinks or remote pipes to shell are apparent.",{"category":75,"check":85,"severity":34,"summary":86},"Sandbox Isolation","The application runs in a Docker container and primarily interacts with its own temporary database files and specified ports. 
No obvious attempts to access files outside the project folder or user-specific scopes were found.",{"category":75,"check":88,"severity":34,"summary":89},"Sandbox escape primitives","No detached process spawns (`nohup`, `&`), retry loops around denied calls, or background child processes that outlive the hook were detected in the bundled scripts.",{"category":75,"check":91,"severity":34,"summary":92},"Data Exfiltration","No imperative instructions aimed at submitting confidential data to third parties were found. Outbound calls are primarily related to S3 for snapshots, which is expected and documented.",{"category":75,"check":94,"severity":34,"summary":95},"Hidden Text Tricks","Bundled files appear to be free of hidden-steering tricks like HTML comments smuggling instructions or invisible Unicode characters.",{"category":97,"check":98,"severity":34,"summary":99},"Hooks","Opaque code execution","The bundle includes only plain, readable Go source code. There is no evidence of obfuscation, base64-decoded payloads, or runtime code fetching.",{"category":101,"check":102,"severity":34,"summary":103},"Portability","Structural Assumption","The application uses temporary directories for its database file, avoiding assumptions about fixed project structures. It handles its own lifecycle and temporary file management.",{"category":105,"check":106,"severity":34,"summary":107},"Trust","Issues Attention","There are 0 currently open issues and 0 issues closed in the last 90 days, indicating no backlog of unaddressed issues.",{"category":109,"check":110,"severity":34,"summary":111},"Versioning","Release Management","A `version` field is present in `server.json` and `README.md` references the full documentation, indicating clear versioning.",{"category":113,"check":114,"severity":34,"summary":115},"Code Execution","Validation","Input arguments like table names and queries are sanitized and validated. 
The `validateQuery` function checks for malicious patterns, and table names are sanitized using `sanitizeTableName`.",{"category":75,"check":117,"severity":34,"summary":118},"Unguarded Destructive Operations","Operations like `DROP TABLE` are guarded by checks for table existence and the `override` flag, and table names are sanitized to prevent injection.",{"category":113,"check":120,"severity":34,"summary":121},"Error Handling","Errors during database operations, file handling, and network requests are caught, logged, and returned with meaningful messages and appropriate HTTP status codes or structured error responses.",{"category":113,"check":123,"severity":34,"summary":124},"Logging","The application logs actions, errors, and metrics using `slog` and includes request logging middleware, providing a good audit trail.",{"category":126,"check":127,"severity":34,"summary":128},"Compliance","GDPR","The extension operates on user-uploaded data in an ephemeral sandbox. No direct personal data handling is evident beyond what is processed for query execution, and it does not submit personal data to third parties.",{"category":126,"check":130,"severity":34,"summary":131},"Target market","The extension is designed as a general-purpose data sandbox tool and does not appear to have regional limitations or encode jurisdiction-specific rules, so the target market is global.",{"category":101,"check":133,"severity":34,"summary":134},"Runtime stability","The application is built with Go and Docker, aiming for cross-platform compatibility. 
It uses standard libraries and does not appear to make specific OS or shell assumptions beyond typical Go build environments.",{"category":50,"check":136,"severity":34,"summary":137},"Precise Purpose","The description clearly states the purpose (ephemeral SQL sandbox for AI) and implicitly defines scope by listing features like MCP integration and direct API access for CSV uploads.",{"category":50,"check":139,"severity":34,"summary":140},"Concise Frontmatter","The `server.json` frontmatter is concise and provides essential metadata like name, description, repository, version, and package details.",{"category":54,"check":142,"severity":34,"summary":143},"Concise Body","The `DOCS.md` file provides comprehensive details without being excessively long, with implementation details and API usage clearly separated.",{"category":145,"check":146,"severity":34,"summary":147},"Context","Progressive Disclosure","The README links to `DOCS.md` for full documentation, and the code structure itself appears modular, suggesting progressive disclosure of information.",{"category":145,"check":149,"severity":150,"summary":151},"Forked exploration","not_applicable","The skill does not appear to involve heavy exploration or multi-file inspection that would necessitate `context: fork`.",{"category":32,"check":153,"severity":34,"summary":154},"Usage examples","The `DOCS.md` and `README.md` provide clear, end-to-end examples for basic setup, Docker usage, API calls, and Claude Desktop integration.",{"category":32,"check":156,"severity":34,"summary":157},"Edge cases","The `DOCS.md` and `API` code detail handling of file size limits, CSV validation modes (security, encoding), rate limiting, and potential database errors.",{"category":113,"check":159,"severity":150,"summary":160},"Tool Fallback","The extension does not rely on external tools like MCP servers; it bundles its own Go implementation and uses standard 
libraries.",{"category":162,"check":163,"severity":34,"summary":164},"Safety","Halt on unexpected state","The application handles errors gracefully during database operations, file processing, and network requests, returning appropriate error messages and status codes rather than halting unexpectedly.",{"category":101,"check":166,"severity":150,"summary":167},"Cross-skill coupling","This extension is a standalone server and does not appear to implicitly rely on other skills being loaded or coordinate with them.",1777995836298,"This extension provides a secure, ephemeral SQL sandbox for AI workflows and data exploration using Go and DuckDB. It offers MCP integration for AI assistants like Claude, a REST API for CSV uploads and queries, snapshotting to S3, and built-in security guardrails such as rate limiting and CSV injection prevention.","2.0.0","3.4.0","The extension is highly production-ready, well-documented, and actively maintained with robust security practices. All checks passed, indicating a high-quality and trustworthy tool.",98,"A high-quality, production-ready data sandbox tool with excellent documentation, security features, and active 
maintenance.",[16,17,18,19,20,21,22,23,24],"global","verified",{"codeQuality":179,"collectedAt":180,"documentation":181,"maintenance":183,"popularity":185,"security":186,"testCoverage":189},{},1777995826855,{"descriptionLength":182,"readmeSize":8},34,{"closedIssues90d":8,"openIssues90d":8,"pushedAt":184},1777494962000,{"smitheryUniqueUsers":8,"smitheryUseCount":8},{"hasNpmPackage":187,"license":188,"smitheryVerified":187},false,"MIT",{"hasCi":187,"hasTests":187},{"updatedAt":191},1777995897177,{"githubOwner":193,"githubRepo":194,"locale":28,"slug":194,"type":195},"aliengiraffe","spotdb","skill",true,null,{"extract":199,"llm":201,"smithery":202},{"commitSha":200,"license":188},"cfbbef27f89d18939149790a0fa9ce1ee2c5eac5",{"promptVersionExtension":170,"promptVersionScoring":171,"score":173,"targetMarket":176,"tier":177},{"qualityScore":8,"totalActivations":8,"uniqueUsers":8,"useCount":8,"verified":187},{"repoId":204},"kd72fk7ta378vyy81k8hqp5rs5864hzf",{"_creationTime":206,"_id":204,"identity":207,"providers":208,"workflow":226},1777995551258.0476,{"githubOwner":193,"githubRepo":194,"sourceUrl":14},{"discover":209,"github":212},{"sources":210},[211],"mcp-registry",{"closedIssues90d":8,"forks":213,"license":188,"openIssues90d":8,"pushedAt":184,"readmeSize":214,"stars":215,"topics":216},4,2427,20,[217,218,219,220,221,222,16,223,18,224,225],"agentic","agentic-ai","agentic-workflow","agents","ai","ai-agents","docker","lang-chain","llm",{"discoverAt":227,"extractAt":228,"githubAt":228,"updatedAt":228},1777995551258,1777995647021,{"anyEnrichmentAt":230,"extractAt":231,"githubAt":232,"llmAt":191,"smitheryAt":230,"updatedAt":191},1777995723550,1777995620897,1777995621254,[],[235,262,292,323,351,383],{"_creationTime":236,"_id":237,"community":238,"display":239,"identity":248,"providers":251,"relations":256,"workflow":258},1778053100136.2417,"k172f9k8w7xg3bzb9t320dj0cn866517",{"reviewCount":8},{"description":240,"installMethods":241,"name":242,"sourceUrl":243,"tags":244},"Use 
this skill when the user uploads Excel (.xlsx/.xls) or CSV files and wants to perform data analysis, generate statistics, create summaries, pivot tables, SQL queries, or any form of structured data exploration. Supports multi-sheet Excel workbooks, aggregation, filtering, joins, and exporting results to CSV/JSON/Markdown.",{},"Data Analysis Skill","https://github.com/bytedance/deer-flow/tree/HEAD/skills/public/data-analysis",[20,245,24,17,18,246,247],"excel","python","analytics",{"githubOwner":249,"githubRepo":250,"locale":28,"slug":20,"type":195},"bytedance","deer-flow",{"extract":252,"llm":254},{"commitSha":253,"license":188},"1336872b15c25d45ebcb7c1cf72369c2bdd53187",{"promptVersionExtension":170,"promptVersionScoring":171,"score":255,"targetMarket":176,"tier":177},96,{"repoId":257},"kd789sm7egx1h0t1jag6zzhcq98656wv",{"anyEnrichmentAt":259,"extractAt":260,"githubAt":259,"llmAt":261,"updatedAt":261},1778053101076,1778053100136,1778053169012,{"_creationTime":263,"_id":264,"community":265,"display":266,"identity":278,"providers":281,"relations":286,"workflow":288},1778053689272.9238,"k17a5hw81fhwybk1wxavs6mvjs8676ca",{"reviewCount":8},{"description":267,"installMethods":268,"name":269,"sourceUrl":270,"tags":271},"Set up a new Prisma Postgres database and connect it to a local project using the Management API. 
Use when asked to \"set up a database\", \"create a Prisma Postgres project\", \"get a connection string\", \"connect my app to Prisma Postgres\", or \"provision a database\".",{},"Prisma Postgres Setup","https://github.com/prisma/skills/tree/HEAD/prisma-postgres-setup",[272,273,16,274,275,23,276,277],"prisma","postgres","setup","connection","typescript","node-js",{"githubOwner":272,"githubRepo":279,"locale":28,"slug":280,"type":195},"skills","prisma-postgres-setup",{"extract":282,"llm":284},{"commitSha":283,"license":188},"741a74fdafc1bf61fa208c2f73878be688cba263",{"promptVersionExtension":170,"promptVersionScoring":171,"score":285,"targetMarket":176,"tier":177},99,{"repoId":287},"kd76h7swxyhk8405svecsqq7gh864y5s",{"anyEnrichmentAt":289,"extractAt":290,"githubAt":289,"llmAt":291,"updatedAt":291},1778053689723,1778053689272,1778053716548,{"_creationTime":293,"_id":294,"community":295,"display":296,"identity":305,"providers":309,"relations":316,"workflow":318},1778053968286.4934,"k172smq0fxy5e4spbwss5n7xes866z8p",{"reviewCount":8},{"description":297,"name":298,"sourceUrl":299,"tags":300},"Writes and executes SQL queries from simple SELECTs to complex multi-table JOINs, aggregations, and subqueries. 
Use when the user asks to query a database, write SQL, run a SELECT statement, retrieve data, filter records, or generate reports from database tables.","Query Writing Skill","https://github.com/langchain-ai/deepagents/tree/HEAD/examples/text-to-sql-agent/skills/query-writing",[301,302,17,16,303,304],"data-analytics","coding","query","code-generation",{"githubOwner":306,"githubRepo":307,"locale":28,"slug":308,"type":195},"langchain-ai","deepagents","query-writing",{"extract":310,"llm":312,"smithery":313},{"commitSha":311,"license":188},"b108c71d0c570e16c7050c1eac482e15dc35a5ed",{"promptVersionExtension":170,"promptVersionScoring":171,"score":173,"targetMarket":176,"tier":177},{"qualityScore":314,"totalActivations":315,"uniqueUsers":315,"useCount":8,"verified":187},0.67405933,1,{"repoId":317},"kd76dna2fvfbnjvzcpd2cwqnyd865xz7",{"anyEnrichmentAt":319,"extractAt":320,"githubAt":321,"llmAt":322,"smitheryAt":319,"updatedAt":322},1778053994907,1778053968286,1778053969344,1778054053159,{"_creationTime":324,"_id":325,"community":326,"display":327,"identity":338,"providers":341,"relations":345,"workflow":347},1778053148350.4817,"k1799ke3mvvmb9chq1vt0k97k5867cfv",{"reviewCount":8},{"description":328,"installMethods":329,"name":330,"sourceUrl":331,"tags":332},"Build and manage webhook-based integrations for real-time event processing and API connections",{},"Webhook 
Automation","https://github.com/claude-office-skills/skills/tree/HEAD/webhook-automation",[333,23,334,335,336,337,22],"webhook","integration","automation","events","engineering",{"githubOwner":339,"githubRepo":279,"locale":28,"slug":340,"type":195},"claude-office-skills","webhook-automation",{"extract":342,"llm":344},{"commitSha":343,"license":188},"9c4c7d5cd2813a8936bf2c9fdb174ea883b85a11",{"promptVersionExtension":170,"promptVersionScoring":171,"score":173,"targetMarket":176,"tier":177},{"repoId":346},"kd7fw7xbj58qc2z8whrrjptbed8659db",{"anyEnrichmentAt":348,"extractAt":349,"githubAt":348,"llmAt":350,"updatedAt":350},1778053151766,1778053148350,1778053561145,{"_creationTime":352,"_id":353,"community":354,"display":355,"identity":368,"providers":372,"relations":377,"workflow":379},1778054812528.722,"k175exhcf750rkmxhzjvwzw07s866exs",{"reviewCount":8},{"description":356,"name":357,"sourceUrl":358,"tags":359},"Production backend systems development. Stack: Node.js/TypeScript, Python, Go, Rust | NestJS, FastAPI, Django, Express | PostgreSQL, MongoDB, Redis. Capabilities: REST/GraphQL/gRPC APIs, OAuth 2.1/JWT auth, OWASP security, microservices, caching, load balancing, Docker/K8s deployment. Actions: design, build, implement, secure, optimize, deploy, test APIs and services. Keywords: API design, REST, GraphQL, gRPC, authentication, OAuth, JWT, RBAC, database, PostgreSQL, MongoDB, Redis, caching, microservices, Docker, Kubernetes, CI/CD, OWASP, security, performance, scalability, NestJS, FastAPI, Express, middleware, rate limiting. 
Use when: designing APIs, implementing auth/authz, optimizing queries, building microservices, securing endpoints, deploying containers, setting up CI/CD.","Backend Development","https://github.com/samhvw8/dot-claude/tree/HEAD/skills/backend-development",[360,276,246,19,361,23,16,362,363,364,365,366,367],"backend","rust","security","devops","testing","architecture","performance","documentation",{"githubOwner":369,"githubRepo":370,"locale":28,"slug":371,"type":195},"samhvw8","dot-claude","backend-development",{"extract":373,"llm":375},{"commitSha":374,"license":188},"28c76162116d2eedab131c0e1548fdc76a2999f7",{"promptVersionExtension":170,"promptVersionScoring":171,"score":376,"targetMarket":176,"tier":177},95,{"repoId":378},"kd79ad9dpqazy79y2s6rvajgjn865xek",{"anyEnrichmentAt":380,"extractAt":381,"githubAt":380,"llmAt":382,"updatedAt":382},1778054813688,1778054812528,1778054896678,{"_creationTime":384,"_id":385,"community":386,"display":387,"identity":396,"providers":398,"relations":404,"workflow":405},1778053968286.494,"k176w7f9ax59g2xqy6243c9pk1866f9s",{"reviewCount":8},{"description":388,"installMethods":389,"name":390,"sourceUrl":391,"tags":392},"Lists tables, describes columns and data types, identifies foreign key relationships, and maps entity relationships in a database. 
Use when the user asks about database schema, table structure, column types, what tables exist, ERD, foreign keys, or how entities relate.",{},"Schema Exploration","https://github.com/langchain-ai/deepagents/tree/HEAD/examples/text-to-sql-agent/skills/schema-exploration",[301,393,16,17,394,395,367],"research","schema","exploration",{"githubOwner":306,"githubRepo":307,"locale":28,"slug":397,"type":195},"schema-exploration",{"extract":399,"llm":400,"smithery":401},{"commitSha":311,"license":188},{"promptVersionExtension":170,"promptVersionScoring":171,"score":376,"targetMarket":176,"tier":177},{"qualityScore":402,"totalActivations":403,"uniqueUsers":403,"useCount":8,"verified":187},0.69166845,2,{"repoId":317},{"anyEnrichmentAt":319,"extractAt":320,"githubAt":321,"llmAt":322,"smitheryAt":319,"updatedAt":322}]