[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"extension-plugin-huggingface-train-sentence-transformers-en":3,"guides-for-huggingface-train-sentence-transformers":744,"similar-k175rwqsqyx8atwtz5cs5b3fpx86m84e-en":745},{"_creationTime":4,"_id":5,"children":6,"community":7,"display":9,"evaluation":14,"identity":251,"isFallback":234,"parentExtension":256,"providers":291,"relations":295,"repo":296,"tags":742,"workflow":743},1778690773482.4858,"k175rwqsqyx8atwtz5cs5b3fpx86m84e",[],{"reviewCount":8},0,{"description":10,"installMethods":11,"name":12,"sourceUrl":13},"Train or fine-tune sentence-transformers models across all three architectures: SentenceTransformer (bi-encoder embeddings), CrossEncoder (rerankers), and SparseEncoder (SPLADE). Covers loss selection, hard-negative mining, evaluators, distillation, LoRA, Matryoshka, and Hugging Face Hub publishing.",{"claudeCode":12},"train-sentence-transformers","https://github.com/huggingface/skills",{"_creationTime":15,"_id":16,"extensionId":5,"locale":17,"result":18,"trustSignals":232,"workflow":249},1778691173389.3933,"kn7fsp59wnwt4h7nrx2a6jptwd86m46s","en",{"checks":19,"evaluatedAt":200,"extensionSummary":201,"features":202,"nonGoals":208,"promptVersionExtension":212,"promptVersionScoring":213,"purpose":214,"rationale":215,"score":216,"summary":217,"tags":218,"targetMarket":225,"tier":226,"useCases":227},[20,25,28,31,35,38,42,46,49,52,56,60,64,68,71,74,77,80,83,86,90,94,98,102,106,109,112,115,119,122,125,128,131,134,137,141,145,149,152,156,159,162,165,168,171,174,177,180,183,186,190,193,196],{"category":21,"check":22,"severity":23,"summary":24},"Practical Utility","Problem relevance","pass","The description clearly names the problem of training or fine-tuning sentence-transformers models across three architectures and lists specific techniques covered.",{"category":21,"check":26,"severity":23,"summary":27},"Unique selling proposition","The extension provides a structured approach and specific templates for 
training sentence-transformers models, going beyond basic API wrappers and offering value for complex ML tasks.",{"category":21,"check":29,"severity":23,"summary":30},"Production readiness","The extension covers the complete lifecycle for training sentence-transformers models, including setup, training, evaluation, and Hub publishing, making it suitable for production workflows.",{"category":32,"check":33,"severity":23,"summary":34},"Scope","Single responsibility principle","The plugin focuses on the specific domain of training and fine-tuning sentence-transformers models, adhering to a single responsibility.",{"category":32,"check":36,"severity":23,"summary":37},"Description quality","The provided description accurately reflects the plugin's capabilities, including the three architectures and the specific techniques covered.",{"category":39,"check":40,"severity":23,"summary":41},"Invocation","Scoped tools","The plugin's skill, 'train-sentence-transformers', acts as a router to specific reference materials and scripts, effectively scoping its actions.",{"category":43,"check":44,"severity":23,"summary":45},"Documentation","Configuration & parameter reference","The SKILL.md provides comprehensive references for losses, evaluators, training arguments, dataset formats, base model selection, and troubleshooting, covering necessary configurations and defaults.",{"category":32,"check":47,"severity":23,"summary":48},"Tool naming","The single exposed tool/skill name 'train-sentence-transformers' is descriptive and aligns with the plugin's function.",{"category":32,"check":50,"severity":23,"summary":51},"Minimal I/O surface","The plugin's interaction model relies on user prompts and references to documentation/scripts, with no excessive I/O surfaces exposed.",{"category":53,"check":54,"severity":23,"summary":55},"License","License usability","The plugin is licensed under the Apache-2.0 license, which is a permissive open-source 
license.",{"category":57,"check":58,"severity":23,"summary":59},"Maintenance","Commit recency","The last commit was on 2026-05-12, which is recent and indicates active maintenance.",{"category":57,"check":61,"severity":62,"summary":63},"Dependency Management","not_applicable","The extension's primary function is providing guidance and scripts, and it does not appear to have complex third-party dependencies requiring explicit management beyond its stated Python requirements.",{"category":65,"check":66,"severity":23,"summary":67},"Security","Secret Management","The plugin guides users to set up authentication via 'hf auth login' or environment variables, and the scripts are designed to use these securely without echoing secrets.",{"category":65,"check":69,"severity":23,"summary":70},"Injection","The extension relies on documented scripts and references, treating all external data as instructional content for the agent and not as executable code.",{"category":65,"check":72,"severity":23,"summary":73},"Transitive Supply-Chain Grenades","The plugin uses documented scripts and references within its repository, and external dependencies are managed via standard pip installs, avoiding runtime external content execution.",{"category":65,"check":75,"severity":23,"summary":76},"Sandbox Isolation","The plugin's scripts and guidance operate within the user's project context and do not attempt to modify files outside of the intended workflow.",{"category":65,"check":78,"severity":23,"summary":79},"Sandbox escape primitives","No evidence of detached processes or retry loops around denied tool calls is found in the provided script structure.",{"category":65,"check":81,"severity":23,"summary":82},"Data Exfiltration","The plugin's focus on local training and Hub publishing via authenticated CLI commands, with explicit user action, prevents undocumented outbound data submission.",{"category":65,"check":84,"severity":23,"summary":85},"Hidden Text Tricks","The bundled content 
(SKILL.md, references) appears free of hidden steering tricks, using standard markdown and clear prose.",{"category":87,"check":88,"severity":23,"summary":89},"Hooks","Opaque code execution","The plugin's scripts are provided as readable Python and bash, with no evidence of obfuscation, base64 payloads, or runtime URL fetches for code.",{"category":91,"check":92,"severity":23,"summary":93},"Portability","Structural Assumption","The plugin guides users to copy and modify scripts within their project context and outlines clear prerequisites, minimizing assumptions about external project structure.",{"category":95,"check":96,"severity":23,"summary":97},"Trust","Issues Attention","With 4 issues opened and 6 closed in the last 90 days, the closure rate is high (60%), indicating active maintainer engagement.",{"category":99,"check":100,"severity":23,"summary":101},"Versioning","Release Management","The plugin's versioning is implicitly managed through the repository's Git tags and the `pushedAt` timestamp, with clear installation via `plugin install \u003Cskill-name>@huggingface/skills` implying versioning through the repo's commit history.",{"category":103,"check":104,"severity":23,"summary":105},"Code Execution","Validation","The provided scripts and documentation emphasize structured inputs and outputs, referencing validation through standard libraries and explicit user guidance, aligning with best practices.",{"category":65,"check":107,"severity":23,"summary":108},"Unguarded Destructive Operations","The primary destructive operation is pushing to the Hugging Face Hub, which requires explicit authentication and user action, with no automated or unguarded destructive primitives.",{"category":103,"check":110,"severity":23,"summary":111},"Error Handling","The referenced documentation and script templates emphasize structured error handling, reporting, and non-zero exits for failures, enabling the agent to manage errors 
effectively.",{"category":103,"check":113,"severity":23,"summary":114},"Logging","The plugin's workflow explicitly includes teeing logs to files and using trackers like Trackio, providing an audit trail for executed actions.",{"category":116,"check":117,"severity":23,"summary":118},"Compliance","GDPR","The plugin focuses on model training and does not inherently operate on personal data; any data submitted for training would be user-provided and handled by standard Hugging Face libraries.",{"category":116,"check":120,"severity":23,"summary":121},"Target market","The extension is global in scope, providing tools for model training applicable anywhere without regional restrictions.",{"category":91,"check":123,"severity":23,"summary":124},"Runtime stability","The plugin relies on standard Python packages and Hugging Face libraries, ensuring broad compatibility across POSIX and Windows environments.",{"category":43,"check":126,"severity":23,"summary":127},"README","The README is comprehensive, detailing installation, usage, and skill organization within the Hugging Face ecosystem.",{"category":32,"check":129,"severity":62,"summary":130},"Tool surface size","This is a plugin providing a single skill, not a collection of multiple tools.",{"category":39,"check":132,"severity":62,"summary":133},"Overlapping near-synonym tools","This plugin provides a single, well-defined skill; there are no overlapping tools.",{"category":43,"check":135,"severity":23,"summary":136},"Phantom features","All features mentioned in the README, such as support for different architectures and techniques, are detailed in the SKILL.md and its referenced documentation.",{"category":138,"check":139,"severity":23,"summary":140},"Install","Installation instruction","The README provides clear installation instructions for multiple agents (Claude Code, Codex, Gemini CLI, Cursor) and includes copy-pasteable invocation examples.",{"category":142,"check":143,"severity":23,"summary":144},"Errors","Actionable 
error messages","The referenced documentation and example scripts guide users on handling errors, including specific troubleshooting steps and expected outcomes.",{"category":146,"check":147,"severity":23,"summary":148},"Execution","Pinned dependencies","The prerequisites specify pip installs with version constraints (e.g., '>=5.0'), indicating pinned dependencies.",{"category":32,"check":150,"severity":23,"summary":151},"Dry-run preview","While not a direct `--dry-run` flag on the skill itself, the emphasis on script templates and clear output logs allows users to preview intended actions before full execution and Hub push.",{"category":153,"check":154,"severity":23,"summary":155},"Protocol","Idempotent retry & timeouts","The plugin's approach to training and Hub interaction, combined with standard library handling, implies idempotency for critical operations and reliance on underlying Python/HF library timeouts.",{"category":116,"check":157,"severity":23,"summary":158},"Telemetry opt-in","The plugin leverages user-controlled Hugging Face Hub authentication and local execution, with optional tracking via Trackio, adhering to opt-in telemetry principles.",{"category":39,"check":160,"severity":62,"summary":161},"Name collisions","This is a single plugin with one skill, so no name collisions within the bundle are possible.",{"category":39,"check":163,"severity":62,"summary":164},"Hooks-off mechanism","This plugin primarily provides a skill and does not appear to implement user-configurable hooks.",{"category":39,"check":166,"severity":62,"summary":167},"Hook matcher tightness","The plugin does not utilize hooks that require specific matchers.",{"category":65,"check":169,"severity":62,"summary":170},"Hook security","The plugin does not appear to implement any hooks that require security gating.",{"category":87,"check":172,"severity":62,"summary":173},"Silent prompt rewriting","The plugin does not implement UserPromptSubmit hooks that would rewrite prompts 
silently.",{"category":65,"check":175,"severity":62,"summary":176},"Permission Hook","No PermissionRequest hooks are present in this plugin.",{"category":116,"check":178,"severity":62,"summary":179},"Hook privacy","The plugin does not implement hooks that would send data over the network.",{"category":103,"check":181,"severity":62,"summary":182},"Hook dependency","There are no hooks in this plugin that would rely on external scripts or binaries.",{"category":43,"check":184,"severity":23,"summary":185},"Feature Transparency","All critical functionalities, including the different model types and training techniques, are explained in the SKILL.md and its referenced documentation.",{"category":187,"check":188,"severity":62,"summary":189},"Convention","Layout convention adherence","As this appears to be a single skill plugin, the specific plugin directory structure conventions for multiple skills or bin directory entries are not applicable.",{"category":187,"check":191,"severity":62,"summary":192},"Plugin state","This plugin does not appear to maintain persistent state beyond user-provided data during a training run.",{"category":65,"check":194,"severity":23,"summary":195},"Keychain-stored secrets","The plugin guides users to use 'hf auth login' or environment variables for authentication, which are handled securely by the Hugging Face CLI and Python libraries, aligning with best practices for secret management.",{"category":197,"check":198,"severity":23,"summary":199},"Installation","Clean uninstall","The plugin's installation involves registering commands and potentially installing Python packages, which standard agent uninstall procedures should handle cleanly without leaving background daemons or persistent system changes.",1778691173266,"This plugin provides a single skill, 'train-sentence-transformers', which acts as a router to extensive documentation and example scripts for training Hugging Face sentence-transformers models. 
It covers bi-encoder, cross-encoder, and sparse-encoder architectures, detailing loss selection, hard-negative mining, evaluators, and deployment to Hugging Face Hub.",[203,204,205,206,207],"Supports SentenceTransformer, CrossEncoder, and SparseEncoder architectures","Covers loss selection, hard-negative mining, and evaluators","Includes guidance on LoRA, Matryoshka, and distillation","Facilitates Hugging Face Hub publishing","Provides production-ready example scripts and detailed references",[209,210,211],"Synthesizing training scripts from scratch without using provided templates.","Replacing the core Hugging Face `transformers` or `sentence-transformers` libraries.","Providing a GUI for model training.","3.0.0","4.4.0","To provide a structured and comprehensive system for users to train or fine-tune sentence-transformers models across various architectures and techniques, simplifying complex ML workflows.","The plugin is exceptionally well-documented and structured, with comprehensive references and clear guidance. 
It adheres to best practices across security, error handling, and installation.",99,"A high-quality plugin for training sentence-transformers models, offering comprehensive guidance and scripts.",[219,220,221,222,223,224],"machine-learning","nlp","sentence-transformers","model-training","embeddings","reranking","global","verified",[228,229,230,231],"Training sentence-transformers for retrieval, similarity search, or clustering.","Fine-tuning models for specific downstream tasks like classification or reranking.","Implementing SPLADE models for sparse retrieval systems.","Exploring advanced training techniques like LoRA or distillation.",{"codeQuality":233,"collectedAt":235,"documentation":236,"maintenance":239,"security":245,"testCoverage":247},{"hasLockfile":234},false,1778691153525,{"descriptionLength":237,"readmeSize":238},300,9821,{"closedIssues90d":240,"forks":241,"hasChangelog":234,"openIssues90d":242,"pushedAt":243,"stars":244},6,663,4,1778593131000,10482,{"hasNpmPackage":234,"license":246,"smitheryVerified":234},"Apache-2.0",{"hasCi":248,"hasTests":234},true,{"updatedAt":250},1778691173389,{"basePath":252,"githubOwner":253,"githubRepo":254,"locale":17,"slug":12,"type":255},"skills/train-sentence-transformers","huggingface","skills","plugin",{"_creationTime":257,"_id":258,"community":259,"display":260,"identity":265,"parentExtension":268,"providers":269,"relations":285,"tags":287,"workflow":288},1778690773482.4824,"k17es3r8wd37t5rrwqcpp5kwrh86mxx8",{"reviewCount":8},{"description":261,"installMethods":262,"name":264,"sourceUrl":13},"Agent Skills for AI/ML tasks including dataset creation, model training, evaluation, and research paper publishing on Hugging Face 
Hub",{"claudeCode":263},"huggingface/skills","huggingface-skills",{"basePath":266,"githubOwner":253,"githubRepo":254,"locale":17,"slug":254,"type":267},"","marketplace",null,{"evaluate":270,"extract":279},{"promptVersionExtension":271,"promptVersionScoring":213,"score":272,"tags":273,"targetMarket":225,"tier":226},"3.1.0",95,[274,253,275,276,277,278],"ai-ml","datasets","models","research","developer-tools",{"commitSha":280,"marketplace":281,"plugin":283},"HEAD",{"name":264,"pluginCount":282},14,{"mcpCount":8,"provider":284,"skillCount":8},"classify",{"repoId":286},"kd72xwt5xnc0ktc4p7smzfcp3986m959",[274,275,278,253,276,277],{"evaluatedAt":289,"extractAt":290,"updatedAt":289},1778690814090,1778690773482,{"evaluate":292,"extract":294},{"promptVersionExtension":212,"promptVersionScoring":213,"score":216,"tags":293,"targetMarket":225,"tier":226},[219,220,221,222,223,224],{"commitSha":280},{"parentExtensionId":258,"repoId":286},{"_creationTime":297,"_id":286,"identity":298,"providers":299,"workflow":738},1778689536128.5474,{"githubOwner":253,"githubRepo":254,"sourceUrl":13},{"classify":300,"discover":731,"github":734},{"commitSha":280,"extensions":301},[302,315,324,332,340,348,356,364,372,380,388,396,404,412,420,425,468,477,483,489,506,512,519,561,572,591,597,617,629,653,711],{"basePath":266,"description":261,"displayName":264,"installMethods":303,"rationale":304,"selectedPaths":305,"source":314,"sourceLanguage":17,"type":267},{"claudeCode":263},"marketplace.json at .claude-plugin/marketplace.json",[306,309,311],{"path":307,"priority":308},".claude-plugin/marketplace.json","mandatory",{"path":310,"priority":308},"README.md",{"path":312,"priority":313},"LICENSE","high","rule",{"basePath":316,"description":317,"displayName":318,"installMethods":319,"rationale":320,"selectedPaths":321,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-llm-trainer","Train or fine-tune language models using TRL on Hugging Face Jobs infrastructure. 
Covers SFT, DPO, GRPO and reward modeling training methods, plus GGUF conversion for local deployment. Includes hardware selection, cost estimation, Trackio monitoring, and Hub persistence.","huggingface-llm-trainer",{"claudeCode":318},"inline plugin source from marketplace.json at skills/huggingface-llm-trainer",[322],{"path":323,"priority":313},"SKILL.md",{"basePath":325,"description":326,"displayName":327,"installMethods":328,"rationale":329,"selectedPaths":330,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-local-models","Use to select models to run locally with llama.cpp and GGUF on CPU, Mac Metal, CUDA, or ROCm. Covers finding GGUFs, quant selection, running servers, exact GGUF file lookup, conversion, and OpenAI-compatible local serving.","huggingface-local-models",{"claudeCode":327},"inline plugin source from marketplace.json at skills/huggingface-local-models",[331],{"path":323,"priority":313},{"basePath":333,"description":334,"displayName":335,"installMethods":336,"rationale":337,"selectedPaths":338,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-paper-publisher","Publish and manage research papers on Hugging Face Hub. 
Supports creating paper pages, linking papers to models/datasets, claiming authorship, and generating professional markdown-based research articles.","huggingface-paper-publisher",{"claudeCode":335},"inline plugin source from marketplace.json at skills/huggingface-paper-publisher",[339],{"path":323,"priority":313},{"basePath":341,"description":342,"displayName":343,"installMethods":344,"rationale":345,"selectedPaths":346,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-papers","Look up and read Hugging Face paper pages in markdown, and use the papers API for structured metadata like authors, linked models, datasets, Spaces, and media URLs when needed.","huggingface-papers",{"claudeCode":343},"inline plugin source from marketplace.json at skills/huggingface-papers",[347],{"path":323,"priority":313},{"basePath":349,"description":350,"displayName":351,"installMethods":352,"rationale":353,"selectedPaths":354,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-community-evals","Add and manage evaluation results in Hugging Face model cards. Supports extracting eval tables from README content, importing scores from Artificial Analysis API, and running custom evaluations with vLLM/lighteval.","huggingface-community-evals",{"claudeCode":351},"inline plugin source from marketplace.json at skills/huggingface-community-evals",[355],{"path":323,"priority":313},{"basePath":357,"description":358,"displayName":359,"installMethods":360,"rationale":361,"selectedPaths":362,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-best","Find the best AI model for any task by querying Hugging Face leaderboards and benchmarks. 
Recommends top models based on task type, hardware constraints, and benchmark scores.","huggingface-best",{"claudeCode":359},"inline plugin source from marketplace.json at skills/huggingface-best",[363],{"path":323,"priority":313},{"basePath":365,"description":366,"displayName":367,"installMethods":368,"rationale":369,"selectedPaths":370,"source":314,"sourceLanguage":17,"type":255},"skills/hf-cli","Execute Hugging Face Hub operations using the hf CLI. Download models/datasets, upload files, manage repos, and run cloud compute jobs.","hf-cli",{"claudeCode":367},"inline plugin source from marketplace.json at skills/hf-cli",[371],{"path":323,"priority":313},{"basePath":373,"description":374,"displayName":375,"installMethods":376,"rationale":377,"selectedPaths":378,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-trackio","Track and visualize ML training experiments with Trackio. Log metrics via Python API and retrieve them via CLI. Supports real-time dashboards synced to HF Spaces.","huggingface-trackio",{"claudeCode":375},"inline plugin source from marketplace.json at skills/huggingface-trackio",[379],{"path":323,"priority":313},{"basePath":381,"description":382,"displayName":383,"installMethods":384,"rationale":385,"selectedPaths":386,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-datasets","Explore, query, and extract data from any Hugging Face dataset using the Dataset Viewer REST API and npx tooling. 
Zero Python dependencies — covers split/config discovery, row pagination, text search, filtering, SQL via parquetlens, and dataset upload via CLI.","huggingface-datasets",{"claudeCode":383},"inline plugin source from marketplace.json at skills/huggingface-datasets",[387],{"path":323,"priority":313},{"basePath":389,"description":390,"displayName":391,"installMethods":392,"rationale":393,"selectedPaths":394,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-tool-builder","Build reusable scripts for Hugging Face Hub and API workflows. Useful for chaining API calls, enriching Hub metadata, or automating repeated tasks.","huggingface-tool-builder",{"claudeCode":391},"inline plugin source from marketplace.json at skills/huggingface-tool-builder",[395],{"path":323,"priority":313},{"basePath":397,"description":398,"displayName":399,"installMethods":400,"rationale":401,"selectedPaths":402,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-gradio","Build Gradio web UIs and demos in Python. Use when creating or editing Gradio apps, components, event listeners, layouts, or chatbots.","huggingface-gradio",{"claudeCode":399},"inline plugin source from marketplace.json at skills/huggingface-gradio",[403],{"path":323,"priority":313},{"basePath":405,"description":406,"displayName":407,"installMethods":408,"rationale":409,"selectedPaths":410,"source":314,"sourceLanguage":17,"type":255},"skills/transformers-js","Run state-of-the-art machine learning models directly in JavaScript/TypeScript for NLP, computer vision, audio processing, and multimodal tasks. 
Works in Node.js and browsers with WebGPU/WASM using Hugging Face models.","transformers-js",{"claudeCode":407},"inline plugin source from marketplace.json at skills/transformers-js",[411],{"path":323,"priority":313},{"basePath":413,"description":414,"displayName":415,"installMethods":416,"rationale":417,"selectedPaths":418,"source":314,"sourceLanguage":17,"type":255},"skills/huggingface-vision-trainer","Train and fine-tune object detection models (RTDETRv2, YOLOS, DETR and others) and image classification models (timm and transformers models — MobileNetV3, MobileViT, ResNet, ViT/DINOv3) using Transformers Trainer API on Hugging Face Jobs infrastructure or locally. Includes COCO dataset format support, Albumentations augmentation, mAP/mAR metrics, trackio tracking, hardware selection, and Hub persistence.","huggingface-vision-trainer",{"claudeCode":415},"inline plugin source from marketplace.json at skills/huggingface-vision-trainer",[419],{"path":323,"priority":313},{"basePath":252,"description":10,"displayName":12,"installMethods":421,"rationale":422,"selectedPaths":423,"source":314,"sourceLanguage":17,"type":255},{"claudeCode":12},"inline plugin source from marketplace.json at skills/train-sentence-transformers",[424],{"path":323,"priority":313},{"basePath":266,"description":261,"displayName":264,"installMethods":426,"license":246,"rationale":427,"selectedPaths":428,"source":314,"sourceLanguage":17,"type":255},{"claudeCode":264},"plugin manifest at 
.claude-plugin/plugin.json",[429,431,432,433,436,438,440,442,444,446,448,450,452,454,456,458,460,462,464,466],{"path":430,"priority":308},".claude-plugin/plugin.json",{"path":310,"priority":308},{"path":312,"priority":313},{"path":434,"priority":435},"skills/hf-cli/SKILL.md","medium",{"path":437,"priority":435},"skills/huggingface-best/SKILL.md",{"path":439,"priority":435},"skills/huggingface-community-evals/SKILL.md",{"path":441,"priority":435},"skills/huggingface-datasets/SKILL.md",{"path":443,"priority":435},"skills/huggingface-gradio/SKILL.md",{"path":445,"priority":435},"skills/huggingface-llm-trainer/SKILL.md",{"path":447,"priority":435},"skills/huggingface-local-models/SKILL.md",{"path":449,"priority":435},"skills/huggingface-paper-publisher/SKILL.md",{"path":451,"priority":435},"skills/huggingface-papers/SKILL.md",{"path":453,"priority":435},"skills/huggingface-tool-builder/SKILL.md",{"path":455,"priority":435},"skills/huggingface-trackio/SKILL.md",{"path":457,"priority":435},"skills/huggingface-vision-trainer/SKILL.md",{"path":459,"priority":435},"skills/train-sentence-transformers/SKILL.md",{"path":461,"priority":435},"skills/transformers-js/SKILL.md",{"path":463,"priority":308},".mcp.json",{"path":465,"priority":313},"agents/AGENTS.md",{"path":467,"priority":313},".cursor-plugin/plugin.json",{"basePath":469,"description":470,"displayName":471,"installMethods":472,"rationale":473,"selectedPaths":474,"source":314,"sourceLanguage":17,"type":476},"hf-mcp/skills/hf-mcp","Use Hugging Face Hub via MCP server tools. Search models, datasets, Spaces, papers. Get repo details, fetch documentation, run compute jobs, and use Gradio Spaces as AI tools. 
Available when connected to the HF MCP server.","hf-mcp",{"claudeCode":263},"SKILL.md frontmatter at hf-mcp/skills/hf-mcp/SKILL.md",[475],{"path":323,"priority":308},"skill",{"basePath":365,"description":478,"displayName":367,"installMethods":479,"rationale":480,"selectedPaths":481,"source":314,"sourceLanguage":17,"type":476},"Hugging Face Hub CLI (`hf`) for downloading, uploading, and managing models, datasets, spaces, buckets, repos, papers, jobs, and more on the Hugging Face Hub. Use when: handling authentication; managing local cache; managing Hugging Face Buckets; running or scheduling jobs on Hugging Face infrastructure; managing Hugging Face repos; discussions and pull requests; browsing models, datasets and spaces; reading, searching, or browsing academic papers; managing collections; querying datasets; configuring spaces; setting up webhooks; or deploying and managing HF Inference Endpoints. Make sure to use this skill whenever the user mentions 'hf', 'huggingface', 'Hugging Face', 'huggingface-cli', or 'hugging face cli', or wants to do anything related to the Hugging Face ecosystem and to AI and ML in general. Also use for cloud storage needs like training checkpoints, data pipelines, or agent traces. Use even if the user doesn't explicitly ask for a CLI command. Replaces the deprecated `huggingface-cli`.",{"claudeCode":263},"SKILL.md frontmatter at skills/hf-cli/SKILL.md",[482],{"path":323,"priority":308},{"basePath":357,"description":484,"displayName":359,"installMethods":485,"rationale":486,"selectedPaths":487,"source":314,"sourceLanguage":17,"type":476},"Use when the user asks about finding the best, top, or recommended model for a task, wants to know what AI model to use, or wants to compare models by benchmark scores. 
Triggers on: \"best model for X\", \"what model should I use for\", \"top models for [task]\", \"which model runs on my laptop/machine/device\", \"recommend a model for\", \"what LLM should I use for\", \"compare models for\", \"what's state of the art for\", or any question about choosing an AI model for a specific use case. Always use this skill when the user wants model recommendations or comparisons, even if they don't explicitly mention HuggingFace or benchmarks.\n",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-best/SKILL.md",[488],{"path":323,"priority":308},{"basePath":349,"description":490,"displayName":351,"installMethods":491,"rationale":492,"selectedPaths":493,"source":314,"sourceLanguage":17,"type":476},"Run evaluations for Hugging Face Hub models using inspect-ai and lighteval on local hardware. Use for backend selection, local GPU evals, and choosing between vLLM / Transformers / accelerate. Not for HF Jobs orchestration, model-card PRs, .eval_results publication, or community-evals automation.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-community-evals/SKILL.md",[494,495,498,500,502,504],{"path":323,"priority":308},{"path":496,"priority":497},"examples/.env.example","low",{"path":499,"priority":497},"examples/USAGE_EXAMPLES.md",{"path":501,"priority":497},"scripts/inspect_eval_uv.py",{"path":503,"priority":497},"scripts/inspect_vllm_uv.py",{"path":505,"priority":497},"scripts/lighteval_vllm_uv.py",{"basePath":381,"description":507,"displayName":383,"installMethods":508,"rationale":509,"selectedPaths":510,"source":314,"sourceLanguage":17,"type":476},"Use this skill for Hugging Face Dataset Viewer API workflows that fetch subset/split metadata, paginate rows, search text, apply filters, download parquet URLs, and read size or statistics.\r",{"claudeCode":263},"SKILL.md frontmatter at 
skills/huggingface-datasets/SKILL.md",[511],{"path":323,"priority":308},{"basePath":397,"description":398,"displayName":399,"installMethods":513,"rationale":514,"selectedPaths":515,"source":314,"sourceLanguage":17,"type":476},{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-gradio/SKILL.md",[516,517],{"path":323,"priority":308},{"path":518,"priority":435},"examples.md",{"basePath":316,"description":520,"displayName":318,"installMethods":521,"rationale":522,"selectedPaths":523,"source":314,"sourceLanguage":17,"type":476},"Train or fine-tune language and vision models using TRL (Transformer Reinforcement Learning) or Unsloth with Hugging Face Jobs infrastructure. Covers SFT, DPO, GRPO and reward modeling training methods, plus GGUF conversion for local deployment. Includes guidance on the TRL Jobs package, UV scripts with PEP 723 format, dataset preparation and validation, hardware selection, cost estimation, Trackio monitoring, Hub authentication, model selection/leaderboards and model persistence. 
Use for tasks involving cloud GPU training, GGUF conversion, or when users mention training on Hugging Face Jobs without local GPU setup.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-llm-trainer/SKILL.md",[524,525,527,529,531,533,535,537,539,541,543,545,547,549,551,553,555,557,559],{"path":323,"priority":308},{"path":526,"priority":435},"references/gguf_conversion.md",{"path":528,"priority":435},"references/hardware_guide.md",{"path":530,"priority":435},"references/hub_saving.md",{"path":532,"priority":435},"references/local_training_macos.md",{"path":534,"priority":435},"references/reliability_principles.md",{"path":536,"priority":435},"references/trackio_guide.md",{"path":538,"priority":435},"references/training_methods.md",{"path":540,"priority":435},"references/training_patterns.md",{"path":542,"priority":435},"references/troubleshooting.md",{"path":544,"priority":435},"references/unsloth.md",{"path":546,"priority":497},"scripts/convert_to_gguf.py",{"path":548,"priority":497},"scripts/dataset_inspector.py",{"path":550,"priority":497},"scripts/estimate_cost.py",{"path":552,"priority":497},"scripts/hf_benchmarks.py",{"path":554,"priority":497},"scripts/train_dpo_example.py",{"path":556,"priority":497},"scripts/train_grpo_example.py",{"path":558,"priority":497},"scripts/train_sft_example.py",{"path":560,"priority":497},"scripts/unsloth_sft_example.py",{"basePath":325,"description":326,"displayName":327,"installMethods":562,"rationale":563,"selectedPaths":564,"source":314,"sourceLanguage":17,"type":476},{"claudeCode":263},"SKILL.md frontmatter at 
skills/huggingface-local-models/SKILL.md",[565,566,568,570],{"path":323,"priority":308},{"path":567,"priority":435},"references/hardware.md",{"path":569,"priority":435},"references/hub-discovery.md",{"path":571,"priority":435},"references/quantization.md",{"basePath":333,"description":334,"displayName":335,"installMethods":573,"rationale":574,"selectedPaths":575,"source":314,"sourceLanguage":17,"type":476},{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-paper-publisher/SKILL.md",[576,577,579,581,583,585,587,589],{"path":323,"priority":308},{"path":578,"priority":497},"examples/example_usage.md",{"path":580,"priority":435},"references/quick_reference.md",{"path":582,"priority":497},"scripts/paper_manager.py",{"path":584,"priority":497},"templates/arxiv.md",{"path":586,"priority":497},"templates/ml-report.md",{"path":588,"priority":497},"templates/modern.md",{"path":590,"priority":497},"templates/standard.md",{"basePath":341,"description":592,"displayName":343,"installMethods":593,"rationale":594,"selectedPaths":595,"source":314,"sourceLanguage":17,"type":476},"Look up and read Hugging Face paper pages in markdown, and use the papers API for structured metadata such as authors, linked models/datasets/spaces, GitHub repo and project page. Use when the user shares a Hugging Face paper page URL, an arXiv URL or ID, or asks to summarize, explain, or analyze an AI research paper.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-papers/SKILL.md",[596],{"path":323,"priority":308},{"basePath":389,"description":598,"displayName":391,"installMethods":599,"rationale":600,"selectedPaths":601,"source":314,"sourceLanguage":17,"type":476},"Use this skill when the user wants to build tools/scripts or achieve a task where using data from the Hugging Face API would help. This is especially useful when chaining or combining API calls, or when the task will be repeated/automated. 
This Skill creates a reusable script to fetch, enrich or process data.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-tool-builder/SKILL.md",[602,603,605,607,609,611,613,615],{"path":323,"priority":308},{"path":604,"priority":435},"references/baseline_hf_api.py",{"path":606,"priority":435},"references/baseline_hf_api.sh",{"path":608,"priority":435},"references/baseline_hf_api.tsx",{"path":610,"priority":435},"references/find_models_by_paper.sh",{"path":612,"priority":435},"references/hf_enrich_models.sh",{"path":614,"priority":435},"references/hf_model_card_frontmatter.sh",{"path":616,"priority":435},"references/hf_model_papers_auth.sh",{"basePath":373,"description":618,"displayName":375,"installMethods":619,"rationale":620,"selectedPaths":621,"source":314,"sourceLanguage":17,"type":476},"Track and visualize ML training experiments with Trackio. Use when logging metrics during training (Python API), firing alerts for training diagnostics, or retrieving/analyzing logged metrics (CLI). Supports real-time dashboard visualization, alerts with webhooks, HF Space syncing, and JSON output for automation.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-trackio/SKILL.md",[622,623,625,627],{"path":323,"priority":308},{"path":624,"priority":435},"references/alerts.md",{"path":626,"priority":435},"references/logging_metrics.md",{"path":628,"priority":435},"references/retrieving_metrics.md",{"basePath":413,"description":630,"displayName":415,"installMethods":631,"rationale":632,"selectedPaths":633,"source":314,"sourceLanguage":17,"type":476},"Trains and fine-tunes vision models for object detection (D-FINE, RT-DETR v2, DETR, YOLOS), image classification (timm models — MobileNetV3, MobileViT, ResNet, ViT/DINOv3 — plus any Transformers classifier), and SAM/SAM2 segmentation using Hugging Face Transformers on Hugging Face Jobs cloud GPUs. 
Covers COCO-format dataset preparation, Albumentations augmentation, mAP/mAR evaluation, accuracy metrics, SAM segmentation with bbox/point prompts, DiceCE loss, hardware selection, cost estimation, Trackio monitoring, and Hub persistence. Use when users mention training object detection, image classification, SAM, SAM2, segmentation, image matting, DETR, D-FINE, RT-DETR, ViT, timm, MobileNet, ResNet, bounding box models, or fine-tuning vision models on Hugging Face Jobs.",{"claudeCode":263},"SKILL.md frontmatter at skills/huggingface-vision-trainer/SKILL.md",[634,635,637,638,640,642,643,645,646,647,649,651],{"path":323,"priority":308},{"path":636,"priority":435},"references/finetune_sam2_trainer.md",{"path":530,"priority":435},{"path":639,"priority":435},"references/image_classification_training_notebook.md",{"path":641,"priority":435},"references/object_detection_training_notebook.md",{"path":534,"priority":435},{"path":644,"priority":435},"references/timm_trainer.md",{"path":548,"priority":497},{"path":550,"priority":497},{"path":648,"priority":497},"scripts/image_classification_training.py",{"path":650,"priority":497},"scripts/object_detection_training.py",{"path":652,"priority":497},"scripts/sam_segmentation_training.py",{"basePath":252,"description":654,"displayName":12,"installMethods":655,"rationale":656,"selectedPaths":657,"source":314,"sourceLanguage":17,"type":476},"Train or fine-tune sentence-transformers models across `SentenceTransformer` (bi-encoder; dense or static embedding model; for retrieval, similarity, clustering, classification, paraphrase mining, dedup, multimodal), `CrossEncoder` (reranker; pair scoring for two-stage retrieval / pair classification), and `SparseEncoder` (SPLADE, sparse embedding model; for learned-sparse retrieval). Covers loss selection, hard-negative mining, evaluators, distillation, LoRA, Matryoshka, and Hugging Face Hub publishing. 
Use for any sentence-transformers training task.",{"claudeCode":263},"SKILL.md frontmatter at skills/train-sentence-transformers/SKILL.md",[658,659,661,663,665,667,669,670,672,674,676,678,680,682,684,685,687,689,691,693,695,697,699,701,703,705,707,709],{"path":323,"priority":308},{"path":660,"priority":435},"references/base_model_selection.md",{"path":662,"priority":435},"references/dataset_formats.md",{"path":664,"priority":435},"references/evaluators_cross_encoder.md",{"path":666,"priority":435},"references/evaluators_sentence_transformer.md",{"path":668,"priority":435},"references/evaluators_sparse_encoder.md",{"path":528,"priority":435},{"path":671,"priority":435},"references/hf_jobs_execution.md",{"path":673,"priority":435},"references/losses_cross_encoder.md",{"path":675,"priority":435},"references/losses_sentence_transformer.md",{"path":677,"priority":435},"references/losses_sparse_encoder.md",{"path":679,"priority":435},"references/model_architectures.md",{"path":681,"priority":435},"references/prompts_and_instructions.md",{"path":683,"priority":435},"references/training_args.md",{"path":542,"priority":435},{"path":686,"priority":497},"scripts/mine_hard_negatives.py",{"path":688,"priority":497},"scripts/train_cross_encoder_distillation_example.py",{"path":690,"priority":497},"scripts/train_cross_encoder_example.py",{"path":692,"priority":497},"scripts/train_cross_encoder_listwise_example.py",{"path":694,"priority":497},"scripts/train_sentence_transformer_distillation_example.py",{"path":696,"priority":497},"scripts/train_sentence_transformer_example.py",{"path":698,"priority":497},"scripts/train_sentence_transformer_make_multilingual_example.py",{"path":700,"priority":497},"scripts/train_sentence_transformer_matryoshka_example.py",{"path":702,"priority":497},"scripts/train_sentence_transformer_multi_dataset_example.py",{"path":704,"priority":497},"scripts/train_sentence_transformer_static_embedding_example.py",{"path":706,"priority":497},"scripts/train_sentence_transformer_with_lora_example.py",
{"path":708,"priority":497},"scripts/train_sparse_encoder_distillation_example.py",{"path":710,"priority":497},"scripts/train_sparse_encoder_example.py",{"basePath":405,"description":712,"displayName":407,"installMethods":713,"rationale":714,"selectedPaths":715,"source":314,"sourceLanguage":17,"type":476},"Use Transformers.js to run state-of-the-art machine learning models directly in JavaScript/TypeScript. Supports NLP (text classification, translation, summarization), computer vision (image classification, object detection), audio (speech recognition, audio classification), and multimodal tasks. Works in browsers and server-side runtimes (Node.js, Bun, Deno) with WebGPU/WASM using pre-trained models from Hugging Face Hub.",{"claudeCode":263},"SKILL.md frontmatter at skills/transformers-js/SKILL.md",[716,717,719,721,723,725,727,729],{"path":323,"priority":308},{"path":718,"priority":435},"references/CACHE.md",{"path":720,"priority":435},"references/CONFIGURATION.md",{"path":722,"priority":435},"references/EXAMPLES.md",{"path":724,"priority":435},"references/MODEL_ARCHITECTURES.md",{"path":726,"priority":435},"references/MODEL_REGISTRY.md",{"path":728,"priority":435},"references/PIPELINE_OPTIONS.md",{"path":730,"priority":435},"references/TEXT_GENERATION.md",{"sources":732},[733],"manual",{"closedIssues90d":240,"description":735,"forks":241,"homepage":736,"license":246,"openIssues90d":242,"pushedAt":243,"readmeSize":238,"stars":244,"topics":737},"Give your agents the power of the Hugging Face 
ecosystem","https://huggingface.co",[],{"classifiedAt":739,"discoverAt":740,"extractAt":741,"githubAt":741,"updatedAt":739},1778690772996,1778689536128,1778690770714,[223,219,222,220,224,221],{"evaluatedAt":250,"extractAt":290,"updatedAt":250},[],[746,777,808,842,872],{"_creationTime":747,"_id":748,"community":749,"display":750,"identity":755,"providers":759,"relations":770,"tags":773,"workflow":774},1778675056600.203,"k17a3t49yvhb9wjth9qywk121x86nvmw",{"reviewCount":8},{"description":751,"installMethods":752,"name":753,"sourceUrl":754},"Autonomous experiment loop that optimizes any file by a measurable metric. 5 slash commands, 8 evaluators, configurable loop intervals (10min to monthly).",{"claudeCode":753},"autoresearch-agent","https://github.com/alirezarezvani/claude-skills",{"basePath":756,"githubOwner":757,"githubRepo":758,"locale":17,"slug":753,"type":255},"engineering/autoresearch-agent","alirezarezvani","claude-skills",{"evaluate":760,"extract":767},{"promptVersionExtension":212,"promptVersionScoring":213,"score":761,"tags":762,"targetMarket":225,"tier":226},100,[763,764,765,766,219],"optimization","experimentation","automation","code-quality",{"commitSha":280,"license":768,"plugin":769},"MIT",{"mcpCount":8,"provider":284,"skillCount":242},{"parentExtensionId":771,"repoId":772},"k17dce6sbramb6sxm7ksr3928x86ncfs","kd7ff9s1w43mfyy1n7hf87816186m6px",[765,766,764,219,763],{"evaluatedAt":775,"extractAt":776,"updatedAt":775},1778675384189,1778675056600,{"_creationTime":778,"_id":779,"community":780,"display":781,"identity":786,"providers":788,"relations":800,"tags":803,"workflow":804},1778693661691.4358,"k177fsagh49r77m9y4755zc1mn86m1jm",{"reviewCount":8},{"description":782,"installMethods":783,"name":784,"sourceUrl":785},"Make assistant output sound human. Strip AI-isms (sycophancy, stock vocab, hedging stacks, em-dash pileups), engineer burstiness, restore voice. 
Preserves code, URLs, and technical accuracy.",{"claudeCode":784},"unslop","https://github.com/MohamedAbdallah-14/unslop",{"basePath":266,"githubOwner":787,"githubRepo":784,"locale":17,"slug":784,"type":255},"MohamedAbdallah-14",{"evaluate":789,"extract":797},{"promptVersionExtension":212,"promptVersionScoring":213,"score":761,"tags":790,"targetMarket":225,"tier":226},[791,792,793,794,795,220,796],"ai","text","writing","editor","code","humanizer",{"commitSha":280,"plugin":798},{"mcpCount":8,"provider":284,"skillCount":799},5,{"parentExtensionId":801,"repoId":802},"k175vxsqnmn2ye2xkw62x4enkh86n8eb","kd727xcarpnqcat3wd68ms466s86mwkb",[791,795,794,796,220,792,793],{"evaluatedAt":805,"extractAt":806,"updatedAt":807},1778693722676,1778693661691,1778693923675,{"_creationTime":809,"_id":810,"community":811,"display":812,"identity":817,"providers":821,"relations":835,"tags":838,"workflow":839},1778696691708.2703,"k1702kbgkcgg2way9x5303rpr186n62a",{"reviewCount":8},{"description":813,"installMethods":814,"name":815,"sourceUrl":816},"Substrate plugin for Ruflo memory: AgentDB controller bridge (15 agentdb_* MCP tools), RuVector ONNX embeddings (10 embeddings_* tools incl. 
RaBitQ 32x quantization), and WASM HNSW pattern router (3 ruvllm_hnsw_* tools)",{"claudeCode":815},"ruflo-agentdb","https://github.com/ruvnet/ruflo",{"basePath":818,"githubOwner":819,"githubRepo":820,"locale":17,"slug":815,"type":255},"plugins/ruflo-agentdb","ruvnet","ruflo",{"evaluate":822,"extract":832},{"promptVersionExtension":212,"promptVersionScoring":213,"score":823,"tags":824,"targetMarket":225,"tier":831},97,[825,223,826,827,828,829,830],"memory","vector-search","agentdb","onnx","hnsw","quantization","community",{"commitSha":280,"license":768,"plugin":833},{"mcpCount":8,"provider":284,"skillCount":834},2,{"parentExtensionId":836,"repoId":837},"k1753196a11bz5jzm7hqzasr0h86nk71","kd7ed28gj8n0y3msk5dzrp05zs86nqtc",[827,223,829,825,828,830,826],{"evaluatedAt":840,"extractAt":841,"updatedAt":840},1778696878749,1778696691708,{"_creationTime":843,"_id":844,"community":845,"display":846,"identity":851,"providers":856,"relations":864,"tags":867,"workflow":868},1778698685517.993,"k176kzhvkthzzw7v2hzx63b98n86n3h5",{"reviewCount":8},{"description":847,"installMethods":848,"name":849,"sourceUrl":850},"Data engineering, ML, and AI specialists - data pipelines, machine learning, LLM 
architecture",{"claudeCode":849},"voltagent-data-ai","https://github.com/VoltAgent/awesome-claude-code-subagents",{"basePath":852,"githubOwner":853,"githubRepo":854,"locale":17,"slug":855,"type":255},"categories/05-data-ai","VoltAgent","awesome-claude-code-subagents","05-data-ai",{"evaluate":857,"extract":863},{"promptVersionExtension":212,"promptVersionScoring":213,"score":823,"tags":858,"targetMarket":225,"tier":226},[859,219,791,860,861,862,220],"data-engineering","llm","data-pipelines","database",{"commitSha":280,"license":768},{"parentExtensionId":865,"repoId":866},"k177jy83mgmpwtgzjm9qkv26mn86mmcm","kd7414ztrsh3tvh3e4bh6672qn86ne05",[791,859,861,862,860,219,220],{"evaluatedAt":869,"extractAt":870,"updatedAt":871},1778698776282,1778698685518,1778699002170,{"_creationTime":873,"_id":874,"community":875,"display":876,"identity":878,"providers":879,"relations":889,"tags":890,"workflow":891},1778690773482.4854,"k17745362t936z67p0p8w8mq0h86nmf0",{"reviewCount":8},{"description":406,"installMethods":877,"name":407,"sourceUrl":13},{"claudeCode":407},{"basePath":405,"githubOwner":253,"githubRepo":254,"locale":17,"slug":407,"type":255},{"evaluate":880,"extract":888},{"promptVersionExtension":212,"promptVersionScoring":213,"score":881,"tags":882,"targetMarket":225,"tier":226},96,[219,883,884,220,885,886,887],"javascript","typescript","computer-vision","audio","multimodal",{"commitSha":280},{"parentExtensionId":258,"repoId":286},[886,885,883,219,887,220,884],{"evaluatedAt":892,"extractAt":290,"updatedAt":892},1778691120894]