[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"extension-plugin-firecrawl-cli-en":3,"guides-for-firecrawl-cli":654,"similar-k17axfavjpz72cd3qqzn86shb186ncqt-en":655},{"_creationTime":4,"_id":5,"children":6,"community":217,"display":218,"evaluation":221,"identity":460,"isFallback":449,"parentExtension":463,"providers":486,"relations":492,"repo":493,"tags":652,"workflow":653},1778686940775.5706,"k17axfavjpz72cd3qqzn86shb186ncqt",[7,45,67,86,107,129,152,176,197],{"_creationTime":8,"_id":9,"community":10,"display":12,"identity":18,"providers":24,"relations":39,"tags":41,"workflow":42},1778686940775.5708,"k17a0qa1tnktamw8db9gxvtk4986mjbh",{"reviewCount":11},0,{"description":13,"installMethods":14,"name":16,"sourceUrl":17},"AI-powered autonomous data extraction that navigates complex sites and returns structured JSON. Use this skill when the user wants structured data from websites, needs to extract pricing tiers, product listings, directory entries, or any data as JSON with a schema. Triggers on \"extract structured data\", \"get all the products\", \"pull pricing info\", \"extract as JSON\", or when the user provides a JSON schema for website data. 
More powerful than simple scraping for multi-page structured extraction.\n",{"claudeCode":15},"firecrawl/cli","firecrawl-agent","https://github.com/firecrawl/cli",{"basePath":19,"githubOwner":20,"githubRepo":21,"locale":22,"slug":16,"type":23},"skills/firecrawl-agent","firecrawl","cli","en","skill",{"evaluate":25,"extract":37},{"promptVersionExtension":26,"promptVersionScoring":27,"score":28,"tags":29,"targetMarket":35,"tier":36},"3.0.0","4.4.0",95,[30,31,32,21,33,34],"data-extraction","web-scraping","json","ai","automation","global","community",{"commitSha":38},"HEAD",{"parentExtensionId":5,"repoId":40},"kd7csd1wb06dg9c1jfy5063f2586ne60",[33,34,21,30,32,31],{"evaluatedAt":43,"extractAt":44,"updatedAt":43},1778687006207,1778686940775,{"_creationTime":46,"_id":47,"community":48,"display":49,"identity":53,"providers":56,"relations":63,"tags":64,"workflow":65},1778686940775.571,"k173y2bqmv365qfxv0rebgfmxd86n50r",{"reviewCount":11},{"description":50,"installMethods":51,"name":52,"sourceUrl":17},"Search, scrape, and interact with the web via the Firecrawl CLI. Use this skill whenever the user wants to search the web, find articles, research a topic, look something up online, scrape a webpage, grab content from a URL, get data from a website, crawl documentation, download a site, or interact with pages that need clicks or logins. Also use when they say \"fetch this page\", \"pull the content from\", \"get the page at https://\", or reference external websites. This provides real-time web search with full page content and interact capabilities — beyond what Claude can do natively with built-in tools. 
Do NOT trigger for local file operations, git commands, deployments, or code editing tasks.\n",{"claudeCode":15},"Firecrawl CLI",{"basePath":54,"githubOwner":20,"githubRepo":21,"locale":22,"slug":55,"type":23},"skills/firecrawl-cli","firecrawl-cli",{"evaluate":57,"extract":61},{"promptVersionExtension":26,"promptVersionScoring":27,"score":58,"tags":59,"targetMarket":35,"tier":36},75,[31,21,60,34,30],"search",{"commitSha":38,"license":62},"MIT",{"parentExtensionId":5,"repoId":40},[34,21,30,60,31],{"evaluatedAt":66,"extractAt":44,"updatedAt":66},1778687034478,{"_creationTime":68,"_id":69,"community":70,"display":71,"identity":75,"providers":77,"relations":82,"tags":83,"workflow":84},1778686940775.5713,"k173h1mfmqg08wryv3g35k86hn86mmbs",{"reviewCount":11},{"description":72,"installMethods":73,"name":74,"sourceUrl":17},"Bulk extract content from an entire website or site section. Use this skill when the user wants to crawl a site, extract all pages from a docs section, bulk-scrape multiple pages following links, or says \"crawl\", \"get all the pages\", \"extract everything under /docs\", \"bulk extract\", or needs content from many pages on the same site. 
Handles depth limits, path filtering, and concurrent extraction.\n",{"claudeCode":15},"firecrawl-crawl",{"basePath":76,"githubOwner":20,"githubRepo":21,"locale":22,"slug":74,"type":23},"skills/firecrawl-crawl",{"evaluate":78,"extract":81},{"promptVersionExtension":26,"promptVersionScoring":27,"score":58,"tags":79,"targetMarket":35,"tier":36},[31,21,30,34,80],"web-crawling",{"commitSha":38},{"parentExtensionId":5,"repoId":40},[34,21,30,80,31],{"evaluatedAt":85,"extractAt":44,"updatedAt":85},1778687056905,{"_creationTime":87,"_id":88,"community":89,"display":90,"identity":93,"providers":96,"relations":103,"tags":104,"workflow":105},1778686940775.5715,"k17bexzyshtx1ecvmz2x6728nn86n0fa",{"reviewCount":11},{"description":91,"installMethods":92,"name":52,"sourceUrl":17},"Download an entire website as local files — markdown, screenshots, or multiple formats per page. Use this skill when the user wants to save a site locally, download documentation for offline use, bulk-save pages as files, or says \"download the site\", \"save as local files\", \"offline copy\", \"download all the docs\", or \"save for reference\". 
Combines site mapping and scraping into organized local directories.\n",{"claudeCode":15},{"basePath":94,"githubOwner":20,"githubRepo":21,"locale":22,"slug":95,"type":23},"skills/firecrawl-download","firecrawl-download",{"evaluate":97,"extract":102},{"promptVersionExtension":26,"promptVersionScoring":27,"score":98,"tags":99,"targetMarket":35,"tier":101},98,[31,100,21,30,34],"download","verified",{"commitSha":38,"license":62},{"parentExtensionId":5,"repoId":40},[34,21,30,100,31],{"evaluatedAt":106,"extractAt":44,"updatedAt":106},1778687082294,{"_creationTime":108,"_id":109,"community":110,"display":111,"identity":115,"providers":118,"relations":125,"tags":126,"workflow":127},1778686940775.5718,"k171m557sa526rn734zqq7hpe586m3b8",{"reviewCount":11},{"description":112,"installMethods":113,"name":114,"sourceUrl":17},"Control and interact with a live browser session on any scraped page — click buttons, fill forms, navigate flows, and extract data using natural language prompts or code. Use when the user needs to interact with a webpage beyond simple scraping: logging into a site, submitting forms, clicking through pagination, handling infinite scroll, navigating multi-step checkout or wizard flows, or when a regular scrape failed because content is behind JavaScript interaction. Also useful for authenticated scraping via profiles. 
Triggers on \"interact\", \"click\", \"fill out the form\", \"log in to\", \"sign in\", \"submit\", \"paginated\", \"next page\", \"infinite scroll\", \"interact with the page\", \"navigate to\", \"open a session\", or \"scrape failed\".\n",{"claudeCode":15},"Firecrawl Interact",{"basePath":116,"githubOwner":20,"githubRepo":21,"locale":22,"slug":117,"type":23},"skills/firecrawl-interact","firecrawl-interact",{"evaluate":119,"extract":124},{"promptVersionExtension":26,"promptVersionScoring":27,"score":98,"tags":120,"targetMarket":35,"tier":101},[121,31,122,21,123],"browser-automation","javascript-interaction","developer-tools",{"commitSha":38,"license":62},{"parentExtensionId":5,"repoId":40},[121,21,123,122,31],{"evaluatedAt":128,"extractAt":44,"updatedAt":128},1778687114384,{"_creationTime":130,"_id":131,"community":132,"display":133,"identity":137,"providers":139,"relations":148,"tags":149,"workflow":150},1778686940775.572,"k17c5kkgjkbd4f35bh8jv5js9h86n4gc",{"reviewCount":11},{"description":134,"installMethods":135,"name":136,"sourceUrl":17},"Discover and list all URLs on a website, with optional search filtering. Use this skill when the user wants to find a specific page on a large site, list all URLs, see the site structure, find where something is on a domain, or says \"map the site\", \"find the URL for\", \"what pages are on\", or \"list all pages\". 
Essential when the user knows which site but not which exact page.\n",{"claudeCode":15},"firecrawl-map",{"basePath":138,"githubOwner":20,"githubRepo":21,"locale":22,"slug":136,"type":23},"skills/firecrawl-map",{"evaluate":140,"extract":147},{"promptVersionExtension":26,"promptVersionScoring":27,"score":141,"tags":142,"targetMarket":35,"tier":101},97,[143,144,145,21,146],"web","scraping","url","discovery",{"commitSha":38},{"parentExtensionId":5,"repoId":40},[21,146,144,145,143],{"evaluatedAt":151,"extractAt":44,"updatedAt":151},1778687127164,{"_creationTime":153,"_id":154,"community":155,"display":156,"identity":160,"providers":162,"relations":172,"tags":173,"workflow":174},1778686940775.5723,"k17d40zvn2sfy64zvq7rzpzksh86mndd",{"reviewCount":11},{"description":157,"installMethods":158,"name":159,"sourceUrl":17},"Efficiently extract and convert the contents of any local file—such as PDF, DOCX, DOC, ODT, RTF, XLSX, XLS, or HTML—into clean, well-formatted markdown saved to disk. Use this skill whenever the user requests to parse, read, or extract information from a file on their computer, including phrases like “parse this PDF”, “convert this document”, “read this file”, “extract text from”, or when a local file path (not a URL) is provided. This skill offers advanced options like generating AI-powered summaries and answering questions based on the file's content. 
Prefer this tool over `scrape` when handling local files to deliver precise, structured outputs for downstream tasks.\n",{"claudeCode":15},"firecrawl-parse",{"basePath":161,"githubOwner":20,"githubRepo":21,"locale":22,"slug":159,"type":23},"skills/firecrawl-parse",{"evaluate":163,"extract":171},{"promptVersionExtension":26,"promptVersionScoring":27,"score":164,"tags":165,"targetMarket":35,"tier":101},99,[166,167,168,169,170,21],"file-conversion","document-parsing","markdown","pdf","docx",{"commitSha":38},{"parentExtensionId":5,"repoId":40},[21,167,170,166,168,169],{"evaluatedAt":175,"extractAt":44,"updatedAt":175},1778687175227,{"_creationTime":177,"_id":178,"community":179,"display":180,"identity":183,"providers":186,"relations":193,"tags":194,"workflow":195},1778686940775.5725,"k17chm1rbb9bh8xn1d9xbgyzwd86ms24",{"reviewCount":11},{"description":181,"installMethods":182,"name":52,"sourceUrl":17},"Extract clean markdown from any URL, including JavaScript-rendered SPAs. Use this skill whenever the user provides a URL and wants its content, says \"scrape\", \"grab\", \"fetch\", \"pull\", \"get the page\", \"extract from this URL\", or \"read this webpage\". Handles JS-rendered pages, multiple concurrent URLs, and returns LLM-optimized markdown. 
Use this instead of WebFetch for any webpage content extraction.\n",{"claudeCode":15},{"basePath":184,"githubOwner":20,"githubRepo":21,"locale":22,"slug":185,"type":23},"skills/firecrawl-scrape","firecrawl-scrape",{"evaluate":187,"extract":192},{"promptVersionExtension":26,"promptVersionScoring":27,"score":28,"tags":188,"targetMarket":35,"tier":36},[31,168,30,21,189,190,191],"spa","javascript-rendering","content-retrieval",{"commitSha":38,"license":62},{"parentExtensionId":5,"repoId":40},[21,191,30,190,168,189,31],{"evaluatedAt":196,"extractAt":44,"updatedAt":196},1778687209738,{"_creationTime":198,"_id":199,"community":200,"display":201,"identity":204,"providers":207,"relations":213,"tags":214,"workflow":215},1778686940775.5728,"k1763fmwvnx4jva5z2b3mfb75n86mhqf",{"reviewCount":11},{"description":202,"installMethods":203,"name":52,"sourceUrl":17},"Web search with full page content extraction. Use this skill whenever the user asks to search the web, find articles, research a topic, look something up, find recent news, discover sources, or says \"search for\", \"find me\", \"look up\", \"what are people saying about\", or \"find articles about\". Returns real search results with optional full-page markdown — not just snippets. 
Provides capabilities beyond Claude's built-in WebSearch.\n",{"claudeCode":15},{"basePath":205,"githubOwner":20,"githubRepo":21,"locale":22,"slug":206,"type":23},"skills/firecrawl-search","firecrawl-search",{"evaluate":208,"extract":212},{"promptVersionExtension":26,"promptVersionScoring":27,"score":141,"tags":209,"targetMarket":35,"tier":101},[31,60,21,30,34,210,211],"research","content-analysis",{"commitSha":38,"license":62},{"parentExtensionId":5,"repoId":40},[34,21,211,30,210,60,31],{"evaluatedAt":216,"extractAt":44,"updatedAt":216},1778687239795,{"reviewCount":11},{"description":219,"installMethods":220,"name":20,"sourceUrl":17},"Scrape, search, crawl, and map the web with a single command.",{"claudeCode":20},{"_creationTime":222,"_id":223,"extensionId":5,"locale":22,"result":224,"trustSignals":439,"workflow":458},1778686985594.3503,"kn70g6map572ch4d8jshq5361186mpzk",{"checks":225,"evaluatedAt":411,"extensionSummary":412,"features":413,"nonGoals":422,"promptVersionExtension":26,"promptVersionScoring":27,"purpose":427,"rationale":428,"score":429,"summary":430,"tags":431,"targetMarket":35,"tier":36,"useCases":434},[226,231,234,237,242,245,249,253,256,259,263,267,270,274,277,280,283,286,289,292,296,300,304,308,312,315,318,321,325,328,331,334,337,340,343,347,351,355,358,362,365,368,372,375,378,381,384,387,390,393,397,400,403,407],{"category":227,"check":228,"severity":229,"summary":230},"Practical Utility","Problem relevance","pass","The description \"Scrape, search, crawl, and map the web with a single command\" clearly names the user problem of needing to interact with web content programmatically.",{"category":227,"check":232,"severity":229,"summary":233},"Unique selling proposition","The extension provides a cohesive set of tools for web interaction beyond basic LLM capabilities, offering value through specialized commands like `agent` and `interact`.",{"category":227,"check":235,"severity":229,"summary":236},"Production readiness","The plugin provides a 
comprehensive suite of tools for web scraping, searching, crawling, and interaction, covering the full lifecycle from discovery to data extraction and manipulation.",{"category":238,"check":239,"severity":240,"summary":241},"Scope","Single responsibility principle","warning","The plugin bundles a wide array of distinct web interaction tools (scrape, search, map, crawl, agent, interact, download, parse, config) which, while related, represent a broad scope that could potentially lead to trigger conflicts or bloat.",{"category":238,"check":243,"severity":229,"summary":244},"Description quality","The displayed description accurately reflects the extension's capabilities by summarizing its core web interaction functions.",{"category":246,"check":247,"severity":229,"summary":248},"Invocation","Scoped tools","The commands are generally scoped to specific web interaction tasks (e.g., `scrape`, `search`, `crawl`), avoiding overly broad or arbitrary command execution.",{"category":250,"check":251,"severity":229,"summary":252},"Documentation","Configuration & parameter reference","The README provides detailed explanations for all commands and their options, including examples and descriptions of formats and usage.",{"category":238,"check":254,"severity":229,"summary":255},"Tool naming","Tool names like `scrape`, `search`, `crawl`, `map`, and `agent` are descriptive and map directly to their intended web interaction functions.",{"category":238,"check":257,"severity":229,"summary":258},"Minimal I/O surface","Commands generally use specific flags and structured inputs (like JSON schemas) or accept direct URLs, and output is configurable to specific formats or files, avoiding excessive diagnostic dumps.",{"category":260,"check":261,"severity":229,"summary":262},"License","License usability","The project is licensed under MIT, a permissive open-source license, clearly indicated in the repository.",{"category":264,"check":265,"severity":229,"summary":266},"Maintenance","Commit 
recency","The last commit was on May 12, 2026, indicating recent maintenance activity.",{"category":264,"check":268,"severity":229,"summary":269},"Dependency Management","The project uses npm and includes a `package.json`, implying standard dependency management practices, and `npm install` is documented.",{"category":271,"check":272,"severity":229,"summary":273},"Security","Secret Management","Secrets like API keys can be managed via environment variables or per-command arguments, and the `login` command handles authentication securely without echoing keys to stdout. The README mentions `FIRECRAWL_API_KEY` environment variable.",{"category":271,"check":275,"severity":229,"summary":276},"Injection","The CLI primarily takes URLs and structured arguments, and the README does not indicate any mechanisms for executing arbitrary remote code or instructions within fetched content.",{"category":271,"check":278,"severity":229,"summary":279},"Transitive Supply-Chain Grenades","The primary installation method is `npm install -g`, and while it bundles Node.js code, there's no indication of runtime fetching of uncommitted code or scripts.",{"category":271,"check":281,"severity":229,"summary":282},"Sandbox Isolation","The CLI operates on web content and network requests; there's no indication of it modifying files outside of a specified output directory or project scope.",{"category":271,"check":284,"severity":229,"summary":285},"Sandbox escape primitives","No evidence of detached process spawns or denial-retry loops is present in the documentation or common CLI usage patterns.",{"category":271,"check":287,"severity":229,"summary":288},"Data Exfiltration","The extension's primary function is web scraping and interaction; telemetry is opt-out and explicitly states that command data, URLs, or file contents are not collected. 
API keys are handled via environment variables or secure input.",{"category":271,"check":290,"severity":229,"summary":291},"Hidden Text Tricks","The README and command documentation appear to be free of hidden steering characters or obfuscation techniques.",{"category":293,"check":294,"severity":229,"summary":295},"Hooks","Opaque code execution","The extension is distributed as an npm package, and the documentation does not suggest any obfuscated or dynamically fetched code execution within its core functionality.",{"category":297,"check":298,"severity":229,"summary":299},"Portability","Structural Assumption","The CLI commands operate on URLs or local file paths provided by the user and do not appear to make strong assumptions about specific project directory structures.",{"category":301,"check":302,"severity":240,"summary":303},"Trust","Issues Attention","In the last 90 days, 4 issues were opened and 1 was closed, indicating a low closure rate and potentially slow maintainer response.",{"category":305,"check":306,"severity":229,"summary":307},"Versioning","Release Management","The plugin has a version declared in `plugin.json` (1.0.8) and the `firecrawl --status` command shows the version, indicating clear release management.",{"category":309,"check":310,"severity":229,"summary":311},"Code Execution","Validation","The CLI commands accept specific parameters and flags, and complex operations like scraping and agent execution likely have internal validation, though explicit mention of schema validation libraries is absent.",{"category":271,"check":313,"severity":229,"summary":314},"Unguarded Destructive Operations","The primary operations involve fetching and processing web data, not destructive file operations or infra changes. 
Any file writes are to specified output paths.",{"category":309,"check":316,"severity":229,"summary":317},"Error Handling","The CLI provides clear error messages for issues like authentication failures and offers command-specific help (`--help`), suggesting reasonable error handling.",{"category":309,"check":319,"severity":229,"summary":320},"Logging","The README mentions saving outputs to `.firecrawl/` and provides examples of using `-o` for file output, implying local audit trails for downloaded content.",{"category":322,"check":323,"severity":229,"summary":324},"Compliance","GDPR","The extension focuses on web scraping and does not inherently handle personal data beyond what is publicly available on websites. Telemetry is opt-out and does not collect PII or command data.",{"category":322,"check":326,"severity":229,"summary":327},"Target market","The extension's functionality is global and not tied to any specific geographic or legal jurisdiction.",{"category":297,"check":329,"severity":229,"summary":330},"Runtime stability","As a Node.js CLI tool, it should be portable across POSIX-compatible systems and Windows, with standard installation via npm.",{"category":250,"check":332,"severity":229,"summary":333},"README","The README is extensive, well-organized, and clearly articulates the extension's purpose, installation, commands, and usage.",{"category":238,"check":335,"severity":240,"summary":336},"Tool surface size","The plugin exposes a large number of distinct commands (scrape, search, map, crawl, agent, interact, download, parse, config, login, logout, credit-usage, view-config, and experimental commands), exceeding the target of 3-10 tools.",{"category":246,"check":338,"severity":240,"summary":339},"Overlapping near-synonym tools","There is potential overlap between `scrape` and `agent` for complex data extraction, and `crawl` and `download` for bulk content retrieval, which might require careful disambiguation by the 
agent.",{"category":250,"check":341,"severity":229,"summary":342},"Phantom features","All documented commands and features appear to have corresponding implementations detailed in the README and SKILL.md files.",{"category":344,"check":345,"severity":229,"summary":346},"Install","Installation instruction","The README provides clear, copy-pasteable installation instructions for npm and npx, including authentication setup and examples.",{"category":348,"check":349,"severity":229,"summary":350},"Errors","Actionable error messages","The CLI provides helpful error messages for common issues like authentication and suggests next steps or documentation references.",{"category":352,"check":353,"severity":229,"summary":354},"Execution","Pinned dependencies","The extension uses npm, and standard practice implies dependencies are managed, likely with a `package-lock.json` present.",{"category":238,"check":356,"severity":229,"summary":357},"Dry-run preview","While not explicitly a `--dry-run` flag, the `agent` command can be used with `--max-credits` to limit spending, and outputs can be saved to file before processing, allowing for a form of preview.",{"category":359,"check":360,"severity":229,"summary":361},"Protocol","Idempotent retry & timeouts","Options like `--timeout` and `--wait` (with polling) suggest handling for operation completion and potential retries. 
Web operations are inherently stateful at the server, but the CLI's interface seems designed for stateless client calls.",{"category":322,"check":363,"severity":229,"summary":364},"Telemetry opt-in","The CLI explicitly states telemetry is opt-out via an environment variable (`FIRECRAWL_NO_TELEMETRY=1`) and details what data is collected (version, OS, dev tools), fulfilling the opt-in requirement.",{"category":246,"check":366,"severity":240,"summary":367},"Name collisions","The plugin bundles many commands like `scrape`, `search`, and `crawl` which are common terms and could potentially collide with other CLI tools or built-in agent commands if not managed carefully.",{"category":246,"check":369,"severity":370,"summary":371},"Hooks-off mechanism","not_applicable","This is a CLI plugin, not a Claude Code extension with hooks that need disabling.",{"category":246,"check":373,"severity":370,"summary":374},"Hook matcher tightness","This is a CLI plugin, not a Claude Code extension with hooks that need matcher configuration.",{"category":271,"check":376,"severity":370,"summary":377},"Hook security","This is a CLI plugin, not a Claude Code extension with hooks that require security evaluation.",{"category":293,"check":379,"severity":370,"summary":380},"Silent prompt rewriting","This is a CLI plugin, not a Claude Code extension with hooks that could rewrite prompts.",{"category":271,"check":382,"severity":370,"summary":383},"Permission Hook","This is a CLI plugin, not a Claude Code extension with permission hooks.",{"category":322,"check":385,"severity":370,"summary":386},"Hook privacy","This is a CLI plugin, not a Claude Code extension with hooks that could send data over the network.",{"category":309,"check":388,"severity":370,"summary":389},"Hook dependency","This is a CLI plugin, not a Claude Code extension with hooks that have script dependencies.",{"category":250,"check":391,"severity":229,"summary":392},"Feature Transparency","The README comprehensively describes 
all commands and experimental features, providing clear explanations of their purpose and usage.",{"category":394,"check":395,"severity":370,"summary":396},"Convention","Layout convention adherence","This is a CLI plugin, not a Claude Code plugin with specific directory layout expectations like `.claude-plugin/`.",{"category":394,"check":398,"severity":370,"summary":399},"Plugin state","This is a CLI plugin, not a Claude Code plugin that persists state under `${CLAUDE_PLUGIN_DATA}`.",{"category":271,"check":401,"severity":229,"summary":402},"Keychain-stored secrets","The README mentions using environment variables (`FIRECRAWL_API_KEY`) or direct API key input, which is a secure way to handle secrets without storing them in plain text configuration files.",{"category":404,"check":405,"severity":229,"summary":406},"Dependencies","Tagged release sourcing","The primary installation method is via npm, which typically points to tagged releases, and the `init` command specifies a version (`@1.16.2`), ensuring pinned dependencies.",{"category":408,"check":409,"severity":229,"summary":410},"Installation","Clean uninstall","Installation is via npm, and uninstalling the global package should remove the CLI. Any locally saved data is handled via the `.firecrawl/` directory, which users can manage themselves.",1778686985460,"This is a Node.js CLI tool that provides a suite of commands for interacting with the web, including searching, scraping, crawling, mapping sites, AI-powered data extraction, and live browser interaction. 
It supports various output formats and can be configured for self-hosted instances.",[414,415,416,417,418,419,420,421],"Web scraping and content extraction","Web search with optional content scraping","Site mapping and URL discovery","Website crawling for bulk content","AI-powered structured data extraction","Live browser interaction for dynamic pages","Local file parsing (PDF, DOCX, etc.)","Authentication management for Firecrawl API",[423,424,425,426],"Performing local file system operations outside of output saving","Deploying web applications or managing infrastructure","Directly manipulating code or project files","Acting as a general-purpose shell or terminal replacement","To provide a unified command-line interface for a wide range of web data extraction and interaction tasks, empowering users to automate web research and data collection.","The plugin has a broad scope with many distinct commands, and a low issue closure rate. However, it is well-documented, actively maintained, and securely handles secrets and telemetry.",78,"A powerful CLI for comprehensive web interaction, offering broad functionality but with a wide tool surface.",[31,21,34,30,60,432,433],"crawl","agent",[435,436,437,438],"Extracting structured data from complex websites","Automating research and information gathering","Saving website content for offline access","Interacting with web pages that require login or dynamic 
elements",{"codeQuality":440,"collectedAt":442,"documentation":443,"maintenance":446,"popularity":454,"security":456,"testCoverage":457},{"hasLockfile":441},true,1778686963191,{"descriptionLength":444,"readmeSize":445},61,25712,{"closedIssues90d":447,"forks":448,"hasChangelog":449,"manifestVersion":450,"openIssues90d":451,"pushedAt":452,"stars":453},1,49,false,"1.0.8",4,1778599393000,383,{"npmDownloads":455},51144,{"hasNpmPackage":441,"smitheryVerified":449},{"hasCi":441,"hasTests":441},{"updatedAt":459},1778686985594,{"basePath":461,"githubOwner":20,"githubRepo":21,"locale":22,"slug":21,"type":462},"","plugin",{"_creationTime":464,"_id":465,"community":466,"display":467,"identity":470,"parentExtension":472,"providers":473,"relations":482,"tags":483,"workflow":484},1778686940775.5703,"k17be4khad1zr773wdg2m4yjhn86mjpr",{"reviewCount":11},{"description":468,"installMethods":469,"name":52,"sourceUrl":17},"CLI and Agent Skill for Firecrawl - Add scrape, search, and browsing capabilities to your AI 
agents",{"claudeCode":15},{"basePath":461,"githubOwner":20,"githubRepo":21,"locale":22,"slug":21,"type":471},"marketplace",null,{"evaluate":474,"extract":478},{"promptVersionExtension":475,"promptVersionScoring":27,"score":141,"tags":476,"targetMarket":35,"tier":101},"3.1.0",[31,21,477,30,60,432,34],"ai-agents",{"commitSha":38,"license":62,"marketplace":479,"plugin":480},{"name":20,"pluginCount":447},{"mcpCount":11,"provider":481,"skillCount":11},"classify",{"repoId":40},[477,34,21,432,30,60,31],{"evaluatedAt":485,"extractAt":44,"updatedAt":485},1778686962954,{"evaluate":487,"extract":489},{"promptVersionExtension":26,"promptVersionScoring":27,"score":429,"tags":488,"targetMarket":35,"tier":36},[31,21,34,30,60,432,433],{"commitSha":38,"plugin":490},{"mcpCount":11,"provider":481,"skillCount":491},9,{"parentExtensionId":465,"repoId":40},{"_creationTime":494,"_id":40,"identity":495,"providers":496,"workflow":647},1778686934511.6108,{"githubOwner":20,"githubRepo":21,"sourceUrl":17},{"classify":497,"discover":639,"extract":642,"github":643,"npm":646},{"commitSha":38,"extensions":498},[499,509,535,541,546,551,556,561,566,571,576,581],{"basePath":461,"displayName":20,"installMethods":500,"rationale":501,"selectedPaths":502,"source":508,"sourceLanguage":22,"type":471},{"claudeCode":15},"marketplace.json at .claude-plugin/marketplace.json",[503,506],{"path":504,"priority":505},".claude-plugin/marketplace.json","mandatory",{"path":507,"priority":505},"README.md","rule",{"basePath":461,"description":219,"displayName":20,"installMethods":510,"rationale":511,"selectedPaths":512,"source":508,"sourceLanguage":22,"type":462},{"claudeCode":20},"plugin manifest at .claude-plugin/plugin.json",[513,515,516,519,521,523,525,527,529,531,533],{"path":514,"priority":505},".claude-plugin/plugin.json",{"path":507,"priority":505},{"path":517,"priority":518},"skills/firecrawl-agent/SKILL.md","medium",{"path":520,"priority":518},"skills/firecrawl-cli/SKILL.md",{"path":522,"priority":518},"skills/firecrawl-crawl/SKILL.md",{"path":524,"priority":518},"skills/firecrawl-download/SKILL.md",{"path":526,"priority":518},"skills/firecrawl-interact/SKILL.md",{"path":528,"priority":518},"skills/firecrawl-map/SKILL.md",{"path":530,"priority":518},"skills/firecrawl-parse/SKILL.md",{"path":532,"priority":518},"skills/firecrawl-scrape/SKILL.md",{"path":534,"priority":518},"skills/firecrawl-search/SKILL.md",{"basePath":19,"description":13,"displayName":16,"installMethods":536,"rationale":537,"selectedPaths":538,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-agent/SKILL.md",[539],{"path":540,"priority":505},"SKILL.md",{"basePath":54,"description":50,"displayName":20,"installMethods":542,"rationale":543,"selectedPaths":544,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-cli/SKILL.md",[545],{"path":540,"priority":505},{"basePath":76,"description":72,"displayName":74,"installMethods":547,"rationale":548,"selectedPaths":549,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-crawl/SKILL.md",[550],{"path":540,"priority":505},{"basePath":94,"description":91,"displayName":95,"installMethods":552,"rationale":553,"selectedPaths":554,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-download/SKILL.md",[555],{"path":540,"priority":505},{"basePath":116,"description":112,"displayName":117,"installMethods":557,"rationale":558,"selectedPaths":559,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-interact/SKILL.md",[560],{"path":540,"priority":505},{"basePath":138,"description":134,"displayName":136,"installMethods":562,"rationale":563,"selectedPaths":564,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-map/SKILL.md",[565],{"path":540,"priority":505},{"basePath":161,"description":157,"displayName":159,"installMethods":567,"rationale":568,"selectedPaths":569,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-parse/SKILL.md",[570],{"path":540,"priority":505},{"basePath":184,"description":181,"displayName":185,"installMethods":572,"rationale":573,"selectedPaths":574,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-scrape/SKILL.md",[575],{"path":540,"priority":505},{"basePath":205,"description":202,"displayName":206,"installMethods":577,"rationale":578,"selectedPaths":579,"source":508,"sourceLanguage":22,"type":23},{"claudeCode":15},"SKILL.md frontmatter at skills/firecrawl-search/SKILL.md",[580],{"path":540,"priority":505},{"basePath":461,"description":582,"displayName":55,"installMethods":583,"license":584,"rationale":585,"selectedPaths":586,"source":508,"sourceLanguage":22,"type":21},"Command-line interface for Firecrawl. Scrape, crawl, and extract data from any website directly from your terminal.",{"npm":55},"ISC","cli ecosystem detected at /",[587,589,590,593,595,597,599,601,603,605,607,609,611,613,615,617,619,621,623,625,627,629,631,633,635,637],{"path":588,"priority":505},"package.json",{"path":507,"priority":505},{"path":591,"priority":592},"src/index.ts","low",{"path":594,"priority":592},"src/commands/agent.ts",{"path":596,"priority":592},"src/commands/browser.ts",{"path":598,"priority":592},"src/commands/config.ts",{"path":600,"priority":592},"src/commands/crawl.ts",{"path":602,"priority":592},"src/commands/create.ts",{"path":604,"priority":592},"src/commands/credit-usage.ts",{"path":606,"priority":592},"src/commands/env.ts",{"path":608,"priority":592},"src/commands/experimental/backends.ts",{"path":610,"priority":592},"src/commands/experimental/index.ts",{"path":612,"priority":592},"src/commands/experimental/shared.ts",{"path":614,"priority":592},"src/commands/init.ts",{"path":616,"priority":592},"src/commands/interact.ts",{"path":618,"priority":592},"src/commands/login.ts",{"path":620,"priority":592},"src/commands/logout.ts",{"path":622,"priority":592},"src/commands/map.ts",{"path":624,"priority":592},"src/commands/parse.ts",{"path":626,"priority":592},"src/commands/scrape.ts",{"path":628,"priority":592},"src/commands/search.ts",{"path":630,"priority":592},"src/commands/setup.ts",{"path":632,"priority":592},"src/commands/skills-install.ts",{"path":634,"priority":592},"src/commands/skills-native.ts",{"path":636,"priority":592},"src/commands/status.ts",{"path":638,"priority":592},"src/commands/version.ts",{"sources":640},[641],"manual",{"npmPackage":55},{"closedIssues90d":447,"description":468,"forks":448,"homepage":644,"openIssues90d":451,"pushedAt":452,"readmeSize":445,"stars":453,"topics":645},"http://docs.firecrawl.dev/cli",[],{"downloads":455},{"classifiedAt":648,"discoverAt":649,"extractAt":650,"githubAt":650,"npmAt":651,"updatedAt":648},1778686940560,1778686934511,1778686936677,1778686938425,[433,34,21,432,30,60,31],{"evaluatedAt":459,"extractAt":44,"updatedAt":459},[],[656,687,715],{"_creationTime":657,"_id":658,"community":659,"display":660,"identity":665,"providers":670,"relations":681,"tags":683,"workflow":684},1778685949178.7886,"k175j0a2ttdtwfrzvz3gae0z2186njwq",{"reviewCount":11},{"description":661,"installMethods":662,"name":663,"sourceUrl":664},"SDD WORK-PIPELINE Agent — Requirements analysis & development 6-agent full pipeline with DAG-based orchestration and sliding window context management",{"claudeCode":663},"uc-taskmanager","https://github.com/davepoon/buildwithclaude",{"basePath":666,"githubOwner":667,"githubRepo":668,"locale":22,"slug":669,"type":462},"plugins/agents-uc-taskmanager","davepoon","buildwithclaude","agents-uc-taskmanager",{"evaluate":671,"extract":677},{"promptVersionExtension":26,"promptVersionScoring":27,"score":672,"tags":673,"targetMarket":35,"tier":101},100,[34,674,675,676,433],"development","pipeline","sdd",{"commitSha":38,"license":678,"plugin":679},"GPL-3.0",{"mcpCount":11,"provider":481,"skillCount":680},3,{"repoId":682},"kd719kw54vhmcscq7ckdp59fg586mnt6",[433,34,674,675,676],{"evaluatedAt":685,"extractAt":686,"updatedAt":685},1778687422231,1778685949178,{"_creationTime":688,"_id":689,"community":690,"display":691,"identity":696,"providers":698,"relations":707,"tags":710,"workflow":711},1778699316533.7866,"k17d3jtp70vmbqjhnze3n53ra586n5r8",{"reviewCount":11},{"description":692,"installMethods":693,"name":694,"sourceUrl":695},"Search academic papers via OpenAlex — find papers by keyword, look up details by DOI, with pagination and sorting",{"claudeCode":694},"paper-search","https://github.com/ykdojo/paper-search",{"basePath":461,"githubOwner":697,"githubRepo":694,"locale":22,"slug":694,"type":462},"ykdojo",{"evaluate":699,"extract":705},{"promptVersionExtension":26,"promptVersionScoring":27,"score":672,"tags":700,"targetMarket":35,"tier":101},[701,60,702,703,210,704],"academic","papers","openalex","citations",{"commitSha":38,"license":62,"plugin":706},{"mcpCount":11,"provider":481,"skillCount":447},{"parentExtensionId":708,"repoId":709},"k17abfkyvjasac4fgc8v24wz6186mvem","kd78zpgf1ptwq5s0gcz3yqr9n186mvy5",[701,704,703,702,210,60],{"evaluatedAt":712,"extractAt":713,"updatedAt":714},1778699343032,1778699316533,1778699386711,{"_creationTime":716,"_id":717,"community":718,"display":719,"identity":724,"providers":726,"relations":738,"tags":741,"workflow":742},1778699170774.159,"k17axvhmvwp90strpqcd5b0h7986m80d",{"reviewCount":11},{"description":720,"installMethods":721,"name":722,"sourceUrl":723},"X (Twitter) real-time data platform skill with REST API (100+ endpoints), MCP server (2 tools) & webhooks. Covers tweet search, user lookup, timelines, extraction, monitoring, giveaway draws, credits, support, and confirmation-gated private reads, write actions, webhooks, monitors, and pay-per-use flows. Reads from $0.00015/call.",{"claudeCode":722},"x-twitter-scraper","https://github.com/Xquik-dev/x-twitter-scraper",{"basePath":461,"githubOwner":725,"githubRepo":722,"locale":22,"slug":722,"type":462},"Xquik-dev",{"evaluate":727,"extract":736},{"promptVersionExtension":26,"promptVersionScoring":27,"score":164,"tags":728,"targetMarket":35,"tier":101},[729,730,731,732,30,733,734,735],"x","twitter","api","scraper","mcp-server","monitoring","webhooks",{"commitSha":38,"license":62,"plugin":737},{"mcpCount":11,"provider":481,"skillCount":447},{"parentExtensionId":739,"repoId":740},"k17df5mxb3839qe7nbg1y0hy5986nfbq","kd783enpnwhry153ka0z65ear186mjbh",[731,30,733,734,732,730,735,729],{"evaluatedAt":743,"extractAt":744,"updatedAt":745},1778699215383,1778699170774,1778699295835]