[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"extension-plugin-huggingface-huggingface-vision-trainer-en":3,"guides-for-huggingface-huggingface-vision-trainer":751,"similar-k179t4q3q7ywr9f9ka3wxrmbxs86m34d-en":752},{"_creationTime":4,"_id":5,"children":6,"community":7,"display":9,"evaluation":14,"identity":258,"isFallback":241,"parentExtension":263,"providers":298,"relations":302,"repo":303,"tags":749,"workflow":750},1778690773482.4856,"k179t4q3q7ywr9f9ka3wxrmbxs86m34d",[],{"reviewCount":8},0,{"description":10,"installMethods":11,"name":12,"sourceUrl":13},"Train and fine-tune object detection models (RTDETRv2, YOLOS, DETR and others) and image classification models (timm and transformers models — MobileNetV3, MobileViT, ResNet, ViT/DINOv3) using Transformers Trainer API on Hugging Face Jobs infrastructure or locally. Includes COCO dataset format support, Albumentations augmentation, mAP/mAR metrics, trackio tracking, hardware selection, and Hub persistence.",{"claudeCode":12},"huggingface-vision-trainer","https://github.com/huggingface/skills",{"_creationTime":15,"_id":16,"extensionId":5,"locale":17,"result":18,"trustSignals":239,"workflow":256},1778691153160.2527,"kn7e1zh3v42rjy86df10nwyrmx86md8r","en",{"checks":19,"evaluatedAt":204,"extensionSummary":205,"features":206,"nonGoals":213,"promptVersionExtension":217,"promptVersionScoring":218,"purpose":219,"rationale":220,"score":221,"summary":222,"tags":223,"targetMarket":232,"tier":233,"useCases":234},[20,25,28,31,35,38,43,47,50,53,57,61,64,68,71,74,77,80,83,86,90,94,98,102,106,109,112,115,119,122,125,128,131,134,137,141,145,149,152,156,159,162,165,168,171,174,177,180,183,186,190,193,196,200],{"category":21,"check":22,"severity":23,"summary":24},"Practical Utility","Problem relevance","pass","The description clearly states the problem of training and fine-tuning object detection and image classification models, and addresses the need for a robust workflow with specific datasets, metrics, and 
infrastructure.",{"category":21,"check":26,"severity":23,"summary":27},"Unique selling proposition","This plugin offers significant value beyond basic model training by integrating with Hugging Face Jobs, supporting various model architectures and datasets, and automating complex tasks like dataset validation and Hub persistence.",{"category":21,"check":29,"severity":23,"summary":30},"Production readiness","The plugin provides a complete lifecycle for vision model training, from dataset validation and hardware selection to training execution on Hugging Face Jobs and Hub persistence, making it suitable for production workflows.",{"category":32,"check":33,"severity":23,"summary":34},"Scope","Single responsibility principle","The plugin focuses specifically on training and fine-tuning vision models (object detection, image classification, segmentation) using Hugging Face infrastructure, maintaining a coherent domain.",{"category":32,"check":36,"severity":23,"summary":37},"Description quality","The displayed description accurately and concisely reflects the capabilities of the plugin, detailing the supported models, tasks, infrastructure, and features.",{"category":39,"check":40,"severity":41,"summary":42},"Invocation","Scoped tools","not_applicable","This is a plugin and does not expose individual tools in the same way a skill does; its functionality is accessed via the plugin's overall interface.",{"category":44,"check":45,"severity":23,"summary":46},"Documentation","Configuration & parameter reference","The SKILL.md file provides detailed `script_args` for different training types and clearly outlines required flags, options, and their meanings, including boolean syntax.",{"category":32,"check":48,"severity":41,"summary":49},"Tool naming","As a plugin, it does not expose individual tools with names to evaluate for this check.",{"category":32,"check":51,"severity":41,"summary":52},"Minimal I/O surface","This check applies to individual tools; the plugin's I/O is 
managed by the host agent and its integration points.",{"category":54,"check":55,"severity":23,"summary":56},"License","License usability","The plugin is licensed under the Apache-2.0 license, as indicated by the bundled LICENSE file, which is a permissive open-source license.",{"category":58,"check":59,"severity":23,"summary":60},"Maintenance","Commit recency","The repository shows recent commits, with the latest push at 2026-05-12, indicating active maintenance.",{"category":58,"check":62,"severity":23,"summary":63},"Dependency Management","The use of PEP 723 inline dependencies and `uv run` suggests a modern and manageable approach to dependency handling for the scripts.",{"category":65,"check":66,"severity":23,"summary":67},"Security","Secret Management","Secrets (like HF_TOKEN) are explicitly handled via job secrets and injected into the training script's environment variables, with clear documentation on usage.",{"category":65,"check":69,"severity":23,"summary":70},"Injection","The scripts are designed to run Python code, and the documentation emphasizes validating datasets and using specific scripts, implying data is treated as input rather than instructions.",{"category":65,"check":72,"severity":23,"summary":73},"Transitive Supply-Chain Grenades","The scripts rely on bundled Python code and well-defined inputs (datasets, model names); there's no evidence of runtime downloads of arbitrary code or instructions.",{"category":65,"check":75,"severity":23,"summary":76},"Sandbox Isolation","Training jobs run on Hugging Face Jobs infrastructure, which provides sandboxed environments. 
The scripts focus on model training and Hub interaction, not modifying arbitrary file paths outside their scope.",{"category":65,"check":78,"severity":23,"summary":79},"Sandbox escape primitives","The scripts are Python-based training routines executed within a managed job environment, with no indication of detached process spawns or escape primitives.",{"category":65,"check":81,"severity":23,"summary":82},"Data Exfiltration","Outbound calls are limited to interacting with the Hugging Face Hub for model/dataset access and persistence, which is documented and essential for the skill's function. No evidence of unauthorized exfiltration.",{"category":65,"check":84,"severity":23,"summary":85},"Hidden Text Tricks","The bundled scripts and documentation appear to be standard, readable Python and Markdown, with no hidden steering tricks or obfuscation detected.",{"category":87,"check":88,"severity":23,"summary":89},"Hooks","Opaque code execution","The core functionality relies on standard Python scripts and Hugging Face libraries, with no evidence of obfuscated code, base64 payloads, or runtime fetched scripts.",{"category":91,"check":92,"severity":23,"summary":93},"Portability","Structural Assumption","The scripts operate on Hugging Face Hub datasets and models, and their output is directed to the Hub or a specified output directory, minimizing assumptions about user project structure.",{"category":95,"check":96,"severity":23,"summary":97},"Trust","Issues Attention","In the last 90 days, 4 issues were opened and 6 closed, indicating active engagement and a healthy closure rate.",{"category":99,"check":100,"severity":23,"summary":101},"Versioning","Release Management","The repository is actively maintained with recent commits, and the use of `main` branch for installation with an explicit `huggingface/skills` source suggests a versioning strategy is in place.",{"category":103,"check":104,"severity":23,"summary":105},"Code Execution","Validation","The use of 
`HfArgumentParser` for CLI arguments and the explicit dataset validation step suggest robust input handling and validation practices.",{"category":65,"check":107,"severity":23,"summary":108},"Unguarded Destructive Operations","The primary operations are model training and uploading to the Hub, which are controlled by user intent and authentication. No destructive operations like arbitrary file deletion or system changes are evident.",{"category":103,"check":110,"severity":23,"summary":111},"Error Handling","The scripts utilize standard Python error handling and `HfArgumentParser`, and the documentation details common failure modes and troubleshooting steps, indicating good error reporting.",{"category":103,"check":113,"severity":23,"summary":114},"Logging","The use of Hugging Face Jobs and Trackio implies structured logging of training progress and outcomes. The documentation also references local script execution and job logs.",{"category":116,"check":117,"severity":41,"summary":118},"Compliance","GDPR","The plugin focuses on training ML models and interacting with the Hugging Face Hub. It does not appear to handle personal data beyond user authentication tokens.",{"category":116,"check":120,"severity":23,"summary":121},"Target market","The extension is designed for general machine learning tasks and Hugging Face Hub integration, with no regional restrictions detected. `targetMarket` is set to 'global'.",{"category":91,"check":123,"severity":23,"summary":124},"Runtime stability","The scripts are Python-based and designed to run on Hugging Face Jobs infrastructure, which abstracts away OS and environment differences. 
The use of `uv run` for local execution also promotes portability.",{"category":44,"check":126,"severity":23,"summary":127},"README","The README provides a comprehensive overview of Hugging Face Skills, installation instructions for various agents, and a detailed list of available skills, including the one being evaluated.",{"category":32,"check":129,"severity":41,"summary":130},"Tool surface size","This is a plugin, not a skill with a defined set of tools exposed to the agent in the same way.",{"category":39,"check":132,"severity":41,"summary":133},"Overlapping near-synonym tools","As a plugin, it does not expose individual tools with overlapping functionality to evaluate.",{"category":44,"check":135,"severity":23,"summary":136},"Phantom features","All advertised features, such as support for specific model architectures, dataset formats, and Hugging Face Jobs integration, are clearly implemented and documented in the SKILL.md.",{"category":138,"check":139,"severity":23,"summary":140},"Install","Installation instruction","The README provides clear, copy-pasteable installation instructions for multiple agents (Claude Code, Codex, Gemini CLI, Cursor) and includes example invocations.",{"category":142,"check":143,"severity":23,"summary":144},"Errors","Actionable error messages","The documentation proactively addresses common failure modes with troubleshooting steps, and the use of `HfArgumentParser` ensures structured arguments, leading to potentially actionable error messages.",{"category":146,"check":147,"severity":23,"summary":148},"Execution","Pinned dependencies","The use of PEP 723 inline dependencies managed by `uv` suggests that specific interpreter versions and dependencies are declared and managed.",{"category":32,"check":150,"severity":41,"summary":151},"Dry-run preview","Model training inherently involves side effects like GPU computation and Hub uploads. 
While a dry-run for the entire process is not feasible, the process is user-initiated and controlled.",{"category":153,"check":154,"severity":23,"summary":155},"Protocol","Idempotent retry & timeouts","The use of Hugging Face Jobs with configurable timeouts and the idempotent nature of uploading to the Hub (models are versioned) suggest good retry and timeout handling. Training jobs themselves are not typically idempotent but have defined end states.",{"category":116,"check":157,"severity":23,"summary":158},"Telemetry opt-in","The plugin utilizes Trackio for monitoring, which is typically opt-in or configured by the user via Hugging Face Hub. The core training process does not appear to have undisclosed telemetry.",{"category":39,"check":160,"severity":41,"summary":161},"Name collisions","This is a plugin and does not expose individual extension names that could collide.",{"category":39,"check":163,"severity":41,"summary":164},"Hooks-off mechanism","The plugin does not appear to expose or manage hooks in a way that would require a separate hooks-off mechanism.",{"category":39,"check":166,"severity":41,"summary":167},"Hook matcher tightness","This plugin does not appear to define or manage hooks directly.",{"category":65,"check":169,"severity":41,"summary":170},"Hook security","The plugin does not appear to utilize hooks that perform destructive or network-touching operations.",{"category":87,"check":172,"severity":41,"summary":173},"Silent prompt rewriting","There are no `UserPromptSubmit` hooks detected that would rewrite prompts.",{"category":65,"check":175,"severity":41,"summary":176},"Permission Hook","No `PermissionRequest` hooks are present in this plugin.",{"category":116,"check":178,"severity":41,"summary":179},"Hook privacy","The plugin does not appear to use hooks for logging or telemetry that send data over the network.",{"category":103,"check":181,"severity":41,"summary":182},"Hook dependency","No custom hooks are bundled with this 
plugin.",{"category":44,"check":184,"severity":23,"summary":185},"Feature Transparency","The README and SKILL.md clearly describe the plugin's functionality, supported models, and how to use it with various agents.",{"category":187,"check":188,"severity":41,"summary":189},"Convention","Layout convention adherence","This is a plugin registered via marketplace URL, not a local checkout following specific file layout conventions within a repository.",{"category":187,"check":191,"severity":41,"summary":192},"Plugin state","The plugin does not appear to manage persistent state beyond temporary job artifacts on Hugging Face Jobs.",{"category":65,"check":194,"severity":41,"summary":195},"Keychain-stored secrets","Secrets are handled via Hugging Face Jobs' secret injection mechanism, not through user-specific keychain storage managed by the plugin.",{"category":197,"check":198,"severity":23,"summary":199},"Dependencies","Tagged release sourcing","The plugin is registered via a GitHub repository URL pointing to a specific commit or branch, and the underlying scripts use PEP 723 for their own dependencies, indicating a versioned sourcing approach.",{"category":201,"check":202,"severity":23,"summary":203},"Installation","Clean uninstall","The plugin primarily interacts with Hugging Face Jobs and the Hub. Any job artifacts are temporary, and uninstalling the plugin should not leave background daemons or persistent system changes.",1778691153030,"This plugin enables training and fine-tuning of various object detection, image classification, and segmentation models using the Hugging Face Transformers library on Hugging Face Jobs. 
It supports custom datasets, multiple model architectures, and automates many aspects of the training workflow, including dataset validation and Hub persistence.",[207,208,209,210,211,212],"Train object detection models (RTDETRv2, YOLOS, DETR)","Train image classification models (timm, transformers)","Train SAM/SAM2 segmentation models","Support for COCO dataset format and Albumentations augmentation","Integration with Hugging Face Jobs for cloud GPU training","Automated dataset validation and Hub persistence",[214,215,216],"Running training jobs on local hardware (though scripts can be run locally for inspection).","Providing a graphical user interface for model training.","Managing or providing datasets; users must supply their own datasets on the Hub.","3.0.0","4.4.0","To provide a seamless and powerful way for users to train and fine-tune computer vision models without managing local GPU infrastructure, leveraging Hugging Face's cloud capabilities.","High quality plugin with excellent documentation and robust implementation. 
Minor areas for improvement relate to plugin-specific conventions not applicable here, and the inherent nature of cloud-based training jobs.",96,"A robust and well-documented plugin for training and fine-tuning vision models on Hugging Face Jobs infrastructure.",[224,225,226,227,228,229,230,231],"machine-learning","computer-vision","object-detection","image-classification","segmentation","hugging-face","transformers","python","global","verified",[235,236,237,238],"Fine-tuning object detection models on custom datasets.","Training image classification models for specific tasks.","Experimenting with SAM/SAM2 models for segmentation on new data.","Leveraging cloud GPUs for computationally intensive vision model training.",{"codeQuality":240,"collectedAt":242,"documentation":243,"maintenance":246,"security":252,"testCoverage":254},{"hasLockfile":241},false,1778691121180,{"descriptionLength":244,"readmeSize":245},408,9821,{"closedIssues90d":247,"forks":248,"hasChangelog":241,"openIssues90d":249,"pushedAt":250,"stars":251},6,663,4,1778593131000,10482,{"hasNpmPackage":241,"license":253,"smitheryVerified":241},"Apache-2.0",{"hasCi":255,"hasTests":241},true,{"updatedAt":257},1778691153160,{"basePath":259,"githubOwner":260,"githubRepo":261,"locale":17,"slug":12,"type":262},"skills/huggingface-vision-trainer","huggingface","skills","plugin",{"_creationTime":264,"_id":265,"community":266,"display":267,"identity":272,"parentExtension":275,"providers":276,"relations":292,"tags":294,"workflow":295},1778690773482.4824,"k17es3r8wd37t5rrwqcpp5kwrh86mxx8",{"reviewCount":8},{"description":268,"installMethods":269,"name":271,"sourceUrl":13},"Agent Skills for AI/ML tasks including dataset creation, model training, evaluation, and research paper publishing on Hugging Face 
Hub",{"claudeCode":270},"huggingface/skills","huggingface-skills",{"basePath":273,"githubOwner":260,"githubRepo":261,"locale":17,"slug":261,"type":274},"","marketplace",null,{"evaluate":277,"extract":286},{"promptVersionExtension":278,"promptVersionScoring":218,"score":279,"tags":280,"targetMarket":232,"tier":233},"3.1.0",95,[281,260,282,283,284,285],"ai-ml","datasets","models","research","developer-tools",{"commitSha":287,"marketplace":288,"plugin":290},"HEAD",{"name":271,"pluginCount":289},14,{"mcpCount":8,"provider":291,"skillCount":8},"classify",{"repoId":293},"kd72xwt5xnc0ktc4p7smzfcp3986m959",[281,282,285,260,283,284],{"evaluatedAt":296,"extractAt":297,"updatedAt":296},1778690814090,1778690773482,{"evaluate":299,"extract":301},{"promptVersionExtension":217,"promptVersionScoring":218,"score":221,"tags":300,"targetMarket":232,"tier":233},[224,225,226,227,228,229,230,231],{"commitSha":287},{"parentExtensionId":265,"repoId":293},{"_creationTime":304,"_id":293,"identity":305,"providers":306,"workflow":745},1778689536128.5474,{"githubOwner":260,"githubRepo":261,"sourceUrl":13},{"classify":307,"discover":738,"github":741},{"commitSha":287,"extensions":308},[309,322,331,339,347,355,363,371,379,387,395,403,411,419,424,432,475,484,490,496,513,519,526,568,579,598,604,624,636,660,718],{"basePath":273,"description":268,"displayName":271,"installMethods":310,"rationale":311,"selectedPaths":312,"source":321,"sourceLanguage":17,"type":274},{"claudeCode":270},"marketplace.json at .claude-plugin/marketplace.json",[313,316,318],{"path":314,"priority":315},".claude-plugin/marketplace.json","mandatory",{"path":317,"priority":315},"README.md",{"path":319,"priority":320},"LICENSE","high","rule",{"basePath":323,"description":324,"displayName":325,"installMethods":326,"rationale":327,"selectedPaths":328,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-llm-trainer","Train or fine-tune language models using TRL on Hugging Face Jobs infrastructure. 
Covers SFT, DPO, GRPO and reward modeling training methods, plus GGUF conversion for local deployment. Includes hardware selection, cost estimation, Trackio monitoring, and Hub persistence.","huggingface-llm-trainer",{"claudeCode":325},"inline plugin source from marketplace.json at skills/huggingface-llm-trainer",[329],{"path":330,"priority":320},"SKILL.md",{"basePath":332,"description":333,"displayName":334,"installMethods":335,"rationale":336,"selectedPaths":337,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-local-models","Use to select models to run locally with llama.cpp and GGUF on CPU, Mac Metal, CUDA, or ROCm. Covers finding GGUFs, quant selection, running servers, exact GGUF file lookup, conversion, and OpenAI-compatible local serving.","huggingface-local-models",{"claudeCode":334},"inline plugin source from marketplace.json at skills/huggingface-local-models",[338],{"path":330,"priority":320},{"basePath":340,"description":341,"displayName":342,"installMethods":343,"rationale":344,"selectedPaths":345,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-paper-publisher","Publish and manage research papers on Hugging Face Hub. 
Supports creating paper pages, linking papers to models/datasets, claiming authorship, and generating professional markdown-based research articles.","huggingface-paper-publisher",{"claudeCode":342},"inline plugin source from marketplace.json at skills/huggingface-paper-publisher",[346],{"path":330,"priority":320},{"basePath":348,"description":349,"displayName":350,"installMethods":351,"rationale":352,"selectedPaths":353,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-papers","Look up and read Hugging Face paper pages in markdown, and use the papers API for structured metadata like authors, linked models, datasets, Spaces, and media URLs when needed.","huggingface-papers",{"claudeCode":350},"inline plugin source from marketplace.json at skills/huggingface-papers",[354],{"path":330,"priority":320},{"basePath":356,"description":357,"displayName":358,"installMethods":359,"rationale":360,"selectedPaths":361,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-community-evals","Add and manage evaluation results in Hugging Face model cards. Supports extracting eval tables from README content, importing scores from Artificial Analysis API, and running custom evaluations with vLLM/lighteval.","huggingface-community-evals",{"claudeCode":358},"inline plugin source from marketplace.json at skills/huggingface-community-evals",[362],{"path":330,"priority":320},{"basePath":364,"description":365,"displayName":366,"installMethods":367,"rationale":368,"selectedPaths":369,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-best","Find the best AI model for any task by querying Hugging Face leaderboards and benchmarks. 
Recommends top models based on task type, hardware constraints, and benchmark scores.","huggingface-best",{"claudeCode":366},"inline plugin source from marketplace.json at skills/huggingface-best",[370],{"path":330,"priority":320},{"basePath":372,"description":373,"displayName":374,"installMethods":375,"rationale":376,"selectedPaths":377,"source":321,"sourceLanguage":17,"type":262},"skills/hf-cli","Execute Hugging Face Hub operations using the hf CLI. Download models/datasets, upload files, manage repos, and run cloud compute jobs.","hf-cli",{"claudeCode":374},"inline plugin source from marketplace.json at skills/hf-cli",[378],{"path":330,"priority":320},{"basePath":380,"description":381,"displayName":382,"installMethods":383,"rationale":384,"selectedPaths":385,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-trackio","Track and visualize ML training experiments with Trackio. Log metrics via Python API and retrieve them via CLI. Supports real-time dashboards synced to HF Spaces.","huggingface-trackio",{"claudeCode":382},"inline plugin source from marketplace.json at skills/huggingface-trackio",[386],{"path":330,"priority":320},{"basePath":388,"description":389,"displayName":390,"installMethods":391,"rationale":392,"selectedPaths":393,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-datasets","Explore, query, and extract data from any Hugging Face dataset using the Dataset Viewer REST API and npx tooling. 
Zero Python dependencies — covers split/config discovery, row pagination, text search, filtering, SQL via parquetlens, and dataset upload via CLI.","huggingface-datasets",{"claudeCode":390},"inline plugin source from marketplace.json at skills/huggingface-datasets",[394],{"path":330,"priority":320},{"basePath":396,"description":397,"displayName":398,"installMethods":399,"rationale":400,"selectedPaths":401,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-tool-builder","Build reusable scripts for Hugging Face Hub and API workflows. Useful for chaining API calls, enriching Hub metadata, or automating repeated tasks.","huggingface-tool-builder",{"claudeCode":398},"inline plugin source from marketplace.json at skills/huggingface-tool-builder",[402],{"path":330,"priority":320},{"basePath":404,"description":405,"displayName":406,"installMethods":407,"rationale":408,"selectedPaths":409,"source":321,"sourceLanguage":17,"type":262},"skills/huggingface-gradio","Build Gradio web UIs and demos in Python. Use when creating or editing Gradio apps, components, event listeners, layouts, or chatbots.","huggingface-gradio",{"claudeCode":406},"inline plugin source from marketplace.json at skills/huggingface-gradio",[410],{"path":330,"priority":320},{"basePath":412,"description":413,"displayName":414,"installMethods":415,"rationale":416,"selectedPaths":417,"source":321,"sourceLanguage":17,"type":262},"skills/transformers-js","Run state-of-the-art machine learning models directly in JavaScript/TypeScript for NLP, computer vision, audio processing, and multimodal tasks. 
Works in Node.js and browsers with WebGPU/WASM using Hugging Face models.","transformers-js",{"claudeCode":414},"inline plugin source from marketplace.json at skills/transformers-js",[418],{"path":330,"priority":320},{"basePath":259,"description":10,"displayName":12,"installMethods":420,"rationale":421,"selectedPaths":422,"source":321,"sourceLanguage":17,"type":262},{"claudeCode":12},"inline plugin source from marketplace.json at skills/huggingface-vision-trainer",[423],{"path":330,"priority":320},{"basePath":425,"description":426,"displayName":427,"installMethods":428,"rationale":429,"selectedPaths":430,"source":321,"sourceLanguage":17,"type":262},"skills/train-sentence-transformers","Train or fine-tune sentence-transformers models across all three architectures: SentenceTransformer (bi-encoder embeddings), CrossEncoder (rerankers), and SparseEncoder (SPLADE). Covers loss selection, hard-negative mining, evaluators, distillation, LoRA, Matryoshka, and Hugging Face Hub publishing.","train-sentence-transformers",{"claudeCode":427},"inline plugin source from marketplace.json at skills/train-sentence-transformers",[431],{"path":330,"priority":320},{"basePath":273,"description":268,"displayName":271,"installMethods":433,"license":253,"rationale":434,"selectedPaths":435,"source":321,"sourceLanguage":17,"type":262},{"claudeCode":271},"plugin manifest at 
.claude-plugin/plugin.json",[436,438,439,440,443,445,447,449,451,453,455,457,459,461,463,465,467,469,471,473],{"path":437,"priority":315},".claude-plugin/plugin.json",{"path":317,"priority":315},{"path":319,"priority":320},{"path":441,"priority":442},"skills/hf-cli/SKILL.md","medium",{"path":444,"priority":442},"skills/huggingface-best/SKILL.md",{"path":446,"priority":442},"skills/huggingface-community-evals/SKILL.md",{"path":448,"priority":442},"skills/huggingface-datasets/SKILL.md",{"path":450,"priority":442},"skills/huggingface-gradio/SKILL.md",{"path":452,"priority":442},"skills/huggingface-llm-trainer/SKILL.md",{"path":454,"priority":442},"skills/huggingface-local-models/SKILL.md",{"path":456,"priority":442},"skills/huggingface-paper-publisher/SKILL.md",{"path":458,"priority":442},"skills/huggingface-papers/SKILL.md",{"path":460,"priority":442},"skills/huggingface-tool-builder/SKILL.md",{"path":462,"priority":442},"skills/huggingface-trackio/SKILL.md",{"path":464,"priority":442},"skills/huggingface-vision-trainer/SKILL.md",{"path":466,"priority":442},"skills/train-sentence-transformers/SKILL.md",{"path":468,"priority":442},"skills/transformers-js/SKILL.md",{"path":470,"priority":315},".mcp.json",{"path":472,"priority":320},"agents/AGENTS.md",{"path":474,"priority":320},".cursor-plugin/plugin.json",{"basePath":476,"description":477,"displayName":478,"installMethods":479,"rationale":480,"selectedPaths":481,"source":321,"sourceLanguage":17,"type":483},"hf-mcp/skills/hf-mcp","Use Hugging Face Hub via MCP server tools. Search models, datasets, Spaces, papers. Get repo details, fetch documentation, run compute jobs, and use Gradio Spaces as AI tools. 
Available when connected to the HF MCP server.","hf-mcp",{"claudeCode":270},"SKILL.md frontmatter at hf-mcp/skills/hf-mcp/SKILL.md",[482],{"path":330,"priority":315},"skill",{"basePath":372,"description":485,"displayName":374,"installMethods":486,"rationale":487,"selectedPaths":488,"source":321,"sourceLanguage":17,"type":483},"Hugging Face Hub CLI (`hf`) for downloading, uploading, and managing models, datasets, spaces, buckets, repos, papers, jobs, and more on the Hugging Face Hub. Use when: handling authentication; managing local cache; managing Hugging Face Buckets; running or scheduling jobs on Hugging Face infrastructure; managing Hugging Face repos; discussions and pull requests; browsing models, datasets and spaces; reading, searching, or browsing academic papers; managing collections; querying datasets; configuring spaces; setting up webhooks; or deploying and managing HF Inference Endpoints. Make sure to use this skill whenever the user mentions 'hf', 'huggingface', 'Hugging Face', 'huggingface-cli', or 'hugging face cli', or wants to do anything related to the Hugging Face ecosystem and to AI and ML in general. Also use for cloud storage needs like training checkpoints, data pipelines, or agent traces. Use even if the user doesn't explicitly ask for a CLI command. Replaces the deprecated `huggingface-cli`.",{"claudeCode":270},"SKILL.md frontmatter at skills/hf-cli/SKILL.md",[489],{"path":330,"priority":315},{"basePath":364,"description":491,"displayName":366,"installMethods":492,"rationale":493,"selectedPaths":494,"source":321,"sourceLanguage":17,"type":483},"Use when the user asks about finding the best, top, or recommended model for a task, wants to know what AI model to use, or wants to compare models by benchmark scores. 
Triggers on: \"best model for X\", \"what model should I use for\", \"top models for [task]\", \"which model runs on my laptop/machine/device\", \"recommend a model for\", \"what LLM should I use for\", \"compare models for\", \"what's state of the art for\", or any question about choosing an AI model for a specific use case. Always use this skill when the user wants model recommendations or comparisons, even if they don't explicitly mention HuggingFace or benchmarks.\n",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-best/SKILL.md",[495],{"path":330,"priority":315},{"basePath":356,"description":497,"displayName":358,"installMethods":498,"rationale":499,"selectedPaths":500,"source":321,"sourceLanguage":17,"type":483},"Run evaluations for Hugging Face Hub models using inspect-ai and lighteval on local hardware. Use for backend selection, local GPU evals, and choosing between vLLM / Transformers / accelerate. Not for HF Jobs orchestration, model-card PRs, .eval_results publication, or community-evals automation.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-community-evals/SKILL.md",[501,502,505,507,509,511],{"path":330,"priority":315},{"path":503,"priority":504},"examples/.env.example","low",{"path":506,"priority":504},"examples/USAGE_EXAMPLES.md",{"path":508,"priority":504},"scripts/inspect_eval_uv.py",{"path":510,"priority":504},"scripts/inspect_vllm_uv.py",{"path":512,"priority":504},"scripts/lighteval_vllm_uv.py",{"basePath":388,"description":514,"displayName":390,"installMethods":515,"rationale":516,"selectedPaths":517,"source":321,"sourceLanguage":17,"type":483},"Use this skill for Hugging Face Dataset Viewer API workflows that fetch subset/split metadata, paginate rows, search text, apply filters, download parquet URLs, and read size or statistics.\r",{"claudeCode":270},"SKILL.md frontmatter at 
skills/huggingface-datasets/SKILL.md",[518],{"path":330,"priority":315},{"basePath":404,"description":405,"displayName":406,"installMethods":520,"rationale":521,"selectedPaths":522,"source":321,"sourceLanguage":17,"type":483},{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-gradio/SKILL.md",[523,524],{"path":330,"priority":315},{"path":525,"priority":442},"examples.md",{"basePath":323,"description":527,"displayName":325,"installMethods":528,"rationale":529,"selectedPaths":530,"source":321,"sourceLanguage":17,"type":483},"Train or fine-tune language and vision models using TRL (Transformer Reinforcement Learning) or Unsloth with Hugging Face Jobs infrastructure. Covers SFT, DPO, GRPO and reward modeling training methods, plus GGUF conversion for local deployment. Includes guidance on the TRL Jobs package, UV scripts with PEP 723 format, dataset preparation and validation, hardware selection, cost estimation, Trackio monitoring, Hub authentication, model selection/leaderboards and model persistence. 
Use for tasks involving cloud GPU training, GGUF conversion, or when users mention training on Hugging Face Jobs without local GPU setup.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-llm-trainer/SKILL.md",[531,532,534,536,538,540,542,544,546,548,550,552,554,556,558,560,562,564,566],{"path":330,"priority":315},{"path":533,"priority":442},"references/gguf_conversion.md",{"path":535,"priority":442},"references/hardware_guide.md",{"path":537,"priority":442},"references/hub_saving.md",{"path":539,"priority":442},"references/local_training_macos.md",{"path":541,"priority":442},"references/reliability_principles.md",{"path":543,"priority":442},"references/trackio_guide.md",{"path":545,"priority":442},"references/training_methods.md",{"path":547,"priority":442},"references/training_patterns.md",{"path":549,"priority":442},"references/troubleshooting.md",{"path":551,"priority":442},"references/unsloth.md",{"path":553,"priority":504},"scripts/convert_to_gguf.py",{"path":555,"priority":504},"scripts/dataset_inspector.py",{"path":557,"priority":504},"scripts/estimate_cost.py",{"path":559,"priority":504},"scripts/hf_benchmarks.py",{"path":561,"priority":504},"scripts/train_dpo_example.py",{"path":563,"priority":504},"scripts/train_grpo_example.py",{"path":565,"priority":504},"scripts/train_sft_example.py",{"path":567,"priority":504},"scripts/unsloth_sft_example.py",{"basePath":332,"description":333,"displayName":334,"installMethods":569,"rationale":570,"selectedPaths":571,"source":321,"sourceLanguage":17,"type":483},{"claudeCode":270},"SKILL.md frontmatter at 
skills/huggingface-local-models/SKILL.md",[572,573,575,577],{"path":330,"priority":315},{"path":574,"priority":442},"references/hardware.md",{"path":576,"priority":442},"references/hub-discovery.md",{"path":578,"priority":442},"references/quantization.md",{"basePath":340,"description":341,"displayName":342,"installMethods":580,"rationale":581,"selectedPaths":582,"source":321,"sourceLanguage":17,"type":483},{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-paper-publisher/SKILL.md",[583,584,586,588,590,592,594,596],{"path":330,"priority":315},{"path":585,"priority":504},"examples/example_usage.md",{"path":587,"priority":442},"references/quick_reference.md",{"path":589,"priority":504},"scripts/paper_manager.py",{"path":591,"priority":504},"templates/arxiv.md",{"path":593,"priority":504},"templates/ml-report.md",{"path":595,"priority":504},"templates/modern.md",{"path":597,"priority":504},"templates/standard.md",{"basePath":348,"description":599,"displayName":350,"installMethods":600,"rationale":601,"selectedPaths":602,"source":321,"sourceLanguage":17,"type":483},"Look up and read Hugging Face paper pages in markdown, and use the papers API for structured metadata such as authors, linked models/datasets/spaces, GitHub repo and project page. Use when the user shares a Hugging Face paper page URL, an arXiv URL or ID, or asks to summarize, explain, or analyze an AI research paper.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-papers/SKILL.md",[603],{"path":330,"priority":315},{"basePath":396,"description":605,"displayName":398,"installMethods":606,"rationale":607,"selectedPaths":608,"source":321,"sourceLanguage":17,"type":483},"Use this skill when the user wants to build tools or scripts or achieve a task where using data from the Hugging Face API would help. This is especially useful when chaining or combining API calls or when the task will be repeated/automated. 
This Skill creates a reusable script to fetch, enrich or process data.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-tool-builder/SKILL.md",[609,610,612,614,616,618,620,622],{"path":330,"priority":315},{"path":611,"priority":442},"references/baseline_hf_api.py",{"path":613,"priority":442},"references/baseline_hf_api.sh",{"path":615,"priority":442},"references/baseline_hf_api.tsx",{"path":617,"priority":442},"references/find_models_by_paper.sh",{"path":619,"priority":442},"references/hf_enrich_models.sh",{"path":621,"priority":442},"references/hf_model_card_frontmatter.sh",{"path":623,"priority":442},"references/hf_model_papers_auth.sh",{"basePath":380,"description":625,"displayName":382,"installMethods":626,"rationale":627,"selectedPaths":628,"source":321,"sourceLanguage":17,"type":483},"Track and visualize ML training experiments with Trackio. Use when logging metrics during training (Python API), firing alerts for training diagnostics, or retrieving/analyzing logged metrics (CLI). Supports real-time dashboard visualization, alerts with webhooks, HF Space syncing, and JSON output for automation.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-trackio/SKILL.md",[629,630,632,634],{"path":330,"priority":315},{"path":631,"priority":442},"references/alerts.md",{"path":633,"priority":442},"references/logging_metrics.md",{"path":635,"priority":442},"references/retrieving_metrics.md",{"basePath":259,"description":637,"displayName":12,"installMethods":638,"rationale":639,"selectedPaths":640,"source":321,"sourceLanguage":17,"type":483},"Trains and fine-tunes vision models for object detection (D-FINE, RT-DETR v2, DETR, YOLOS), image classification (timm models — MobileNetV3, MobileViT, ResNet, ViT/DINOv3 — plus any Transformers classifier), and SAM/SAM2 segmentation using Hugging Face Transformers on Hugging Face Jobs cloud GPUs. 
Covers COCO-format dataset preparation, Albumentations augmentation, mAP/mAR evaluation, accuracy metrics, SAM segmentation with bbox/point prompts, DiceCE loss, hardware selection, cost estimation, Trackio monitoring, and Hub persistence. Use when users mention training object detection, image classification, SAM, SAM2, segmentation, image matting, DETR, D-FINE, RT-DETR, ViT, timm, MobileNet, ResNet, bounding box models, or fine-tuning vision models on Hugging Face Jobs.",{"claudeCode":270},"SKILL.md frontmatter at skills/huggingface-vision-trainer/SKILL.md",[641,642,644,645,647,649,650,652,653,654,656,658],{"path":330,"priority":315},{"path":643,"priority":442},"references/finetune_sam2_trainer.md",{"path":537,"priority":442},{"path":646,"priority":442},"references/image_classification_training_notebook.md",{"path":648,"priority":442},"references/object_detection_training_notebook.md",{"path":541,"priority":442},{"path":651,"priority":442},"references/timm_trainer.md",{"path":555,"priority":504},{"path":557,"priority":504},{"path":655,"priority":504},"scripts/image_classification_training.py",{"path":657,"priority":504},"scripts/object_detection_training.py",{"path":659,"priority":504},"scripts/sam_segmentation_training.py",{"basePath":425,"description":661,"displayName":427,"installMethods":662,"rationale":663,"selectedPaths":664,"source":321,"sourceLanguage":17,"type":483},"Train or fine-tune sentence-transformers models across `SentenceTransformer` (bi-encoder; dense or static embedding model; for retrieval, similarity, clustering, classification, paraphrase mining, dedup, multimodal), `CrossEncoder` (reranker; pair scoring for two-stage retrieval / pair classification), and `SparseEncoder` (SPLADE, sparse embedding model; for learned-sparse retrieval). Covers loss selection, hard-negative mining, evaluators, distillation, LoRA, Matryoshka, and Hugging Face Hub publishing. 
Use for any sentence-transformers training task.",{"claudeCode":270},"SKILL.md frontmatter at skills/train-sentence-transformers/SKILL.md",[665,666,668,670,672,674,676,677,679,681,683,685,687,689,691,692,694,696,698,700,702,704,706,708,710,712,714,716],{"path":330,"priority":315},{"path":667,"priority":442},"references/base_model_selection.md",{"path":669,"priority":442},"references/dataset_formats.md",{"path":671,"priority":442},"references/evaluators_cross_encoder.md",{"path":673,"priority":442},"references/evaluators_sentence_transformer.md",{"path":675,"priority":442},"references/evaluators_sparse_encoder.md",{"path":535,"priority":442},{"path":678,"priority":442},"references/hf_jobs_execution.md",{"path":680,"priority":442},"references/losses_cross_encoder.md",{"path":682,"priority":442},"references/losses_sentence_transformer.md",{"path":684,"priority":442},"references/losses_sparse_encoder.md",{"path":686,"priority":442},"references/model_architectures.md",{"path":688,"priority":442},"references/prompts_and_instructions.md",{"path":690,"priority":442},"references/training_args.md",{"path":549,"priority":442},{"path":693,"priority":504},"scripts/mine_hard_negatives.py",{"path":695,"priority":504},"scripts/train_cross_encoder_distillation_example.py",{"path":697,"priority":504},"scripts/train_cross_encoder_example.py",{"path":699,"priority":504},"scripts/train_cross_encoder_listwise_example.py",{"path":701,"priority":504},"scripts/train_sentence_transformer_distillation_example.py",{"path":703,"priority":504},"scripts/train_sentence_transformer_example.py",{"path":705,"priority":504},"scripts/train_sentence_transformer_make_multilingual_example.py",{"path":707,"priority":504},"scripts/train_sentence_transformer_matryoshka_example.py",{"path":709,"priority":504},"scripts/train_sentence_transformer_multi_dataset_example.py",{"path":711,"priority":504},"scripts/train_sentence_transformer_static_embedding_example.py",{"path":713,"priority":504},"scripts/train_sente
nce_transformer_with_lora_example.py",{"path":715,"priority":504},"scripts/train_sparse_encoder_distillation_example.py",{"path":717,"priority":504},"scripts/train_sparse_encoder_example.py",{"basePath":412,"description":719,"displayName":414,"installMethods":720,"rationale":721,"selectedPaths":722,"source":321,"sourceLanguage":17,"type":483},"Use Transformers.js to run state-of-the-art machine learning models directly in JavaScript/TypeScript. Supports NLP (text classification, translation, summarization), computer vision (image classification, object detection), audio (speech recognition, audio classification), and multimodal tasks. Works in browsers and server-side runtimes (Node.js, Bun, Deno) with WebGPU/WASM using pre-trained models from Hugging Face Hub.",{"claudeCode":270},"SKILL.md frontmatter at skills/transformers-js/SKILL.md",[723,724,726,728,730,732,734,736],{"path":330,"priority":315},{"path":725,"priority":442},"references/CACHE.md",{"path":727,"priority":442},"references/CONFIGURATION.md",{"path":729,"priority":442},"references/EXAMPLES.md",{"path":731,"priority":442},"references/MODEL_ARCHITECTURES.md",{"path":733,"priority":442},"references/MODEL_REGISTRY.md",{"path":735,"priority":442},"references/PIPELINE_OPTIONS.md",{"path":737,"priority":442},"references/TEXT_GENERATION.md",{"sources":739},[740],"manual",{"closedIssues90d":247,"description":742,"forks":248,"homepage":743,"license":253,"openIssues90d":249,"pushedAt":250,"readmeSize":245,"stars":251,"topics":744},"Give your agents the power of the Hugging Face 
ecosystem","https://huggingface.co",[],{"classifiedAt":746,"discoverAt":747,"extractAt":748,"githubAt":748,"updatedAt":746},1778690772996,1778689536128,1778690770714,[225,229,227,224,226,231,228,230],{"evaluatedAt":257,"extractAt":297,"updatedAt":257},[],[753,784,805,824],{"_creationTime":754,"_id":755,"community":756,"display":757,"identity":762,"providers":766,"relations":777,"tags":780,"workflow":781},1778675056600.203,"k17a3t49yvhb9wjth9qywk121x86nvmw",{"reviewCount":8},{"description":758,"installMethods":759,"name":760,"sourceUrl":761},"Autonomous experiment loop that optimizes any file by a measurable metric. 5 slash commands, 8 evaluators, configurable loop intervals (10min to monthly).",{"claudeCode":760},"autoresearch-agent","https://github.com/alirezarezvani/claude-skills",{"basePath":763,"githubOwner":764,"githubRepo":765,"locale":17,"slug":760,"type":262},"engineering/autoresearch-agent","alirezarezvani","claude-skills",{"evaluate":767,"extract":774},{"promptVersionExtension":217,"promptVersionScoring":218,"score":768,"tags":769,"targetMarket":232,"tier":233},100,[770,771,772,773,224],"optimization","experimentation","automation","code-quality",{"commitSha":287,"license":775,"plugin":776},"MIT",{"mcpCount":8,"provider":291,"skillCount":249},{"parentExtensionId":778,"repoId":779},"k17dce6sbramb6sxm7ksr3928x86ncfs","kd7ff9s1w43mfyy1n7hf87816186m6px",[772,773,771,224,770],{"evaluatedAt":782,"extractAt":783,"updatedAt":782},1778675384189,1778675056600,{"_creationTime":785,"_id":786,"community":787,"display":788,"identity":790,"providers":791,"relations":801,"tags":802,"workflow":803},1778690773482.4858,"k175rwqsqyx8atwtz5cs5b3fpx86m84e",{"reviewCount":8},{"description":426,"installMethods":789,"name":427,"sourceUrl":13},{"claudeCode":427},{"basePath":425,"githubOwner":260,"githubRepo":261,"locale":17,"slug":427,"type":262},{"evaluate":792,"extract":800},{"promptVersionExtension":217,"promptVersionScoring":218,"score":793,"tags":794,"targetMarket":232,"tier":
233},99,[224,795,796,797,798,799],"nlp","sentence-transformers","model-training","embeddings","reranking",{"commitSha":287},{"parentExtensionId":265,"repoId":293},[798,224,797,795,799,796],{"evaluatedAt":804,"extractAt":297,"updatedAt":804},1778691173389,{"_creationTime":806,"_id":807,"community":808,"display":809,"identity":811,"providers":812,"relations":820,"tags":821,"workflow":822},1778690773482.4854,"k17745362t936z67p0p8w8mq0h86nmf0",{"reviewCount":8},{"description":413,"installMethods":810,"name":414,"sourceUrl":13},{"claudeCode":414},{"basePath":412,"githubOwner":260,"githubRepo":261,"locale":17,"slug":414,"type":262},{"evaluate":813,"extract":819},{"promptVersionExtension":217,"promptVersionScoring":218,"score":221,"tags":814,"targetMarket":232,"tier":233},[224,815,816,795,225,817,818],"javascript","typescript","audio","multimodal",{"commitSha":287},{"parentExtensionId":265,"repoId":293},[817,225,815,224,818,795,816],{"evaluatedAt":823,"extractAt":297,"updatedAt":823},1778691120894,{"_creationTime":825,"_id":826,"community":827,"display":828,"identity":834,"providers":837,"relations":851,"tags":854,"workflow":855},1778695383013.7256,"k170nrxqt2qdk7aqgpj3k2wk5986mece",{"reviewCount":8},{"description":829,"installMethods":830,"name":832,"sourceUrl":833},"Market research skills for PMs: user personas, market segmentation, sentiment analysis, and competitive analysis.",{"claudeCode":831},"pm-market-research","PM Market 
Research","https://github.com/phuryn/pm-skills",{"basePath":831,"githubOwner":835,"githubRepo":836,"locale":17,"slug":831,"type":262},"phuryn","pm-skills",{"evaluate":838,"extract":848},{"promptVersionExtension":217,"promptVersionScoring":218,"score":839,"tags":840,"targetMarket":232,"tier":847},92,[841,842,843,228,844,845,846],"product-management","market-research","personas","competitive-analysis","sentiment-analysis","market-sizing","community",{"commitSha":287,"license":775,"plugin":849},{"mcpCount":8,"provider":291,"skillCount":850},7,{"parentExtensionId":852,"repoId":853},"k172xh7pnzf1sa7ch900am209d86mvxj","kd759mc43bg9ypk46ka87r3wa586npt5",[844,842,846,843,841,228,845],{"evaluatedAt":856,"extractAt":857,"updatedAt":856},1778695467809,1778695383013]