
Train Sentence Transformers

Plugin · Verified · Active

Train or fine-tune sentence-transformers models across all three architectures: SentenceTransformer (bi-encoder embeddings), CrossEncoder (rerankers), and SparseEncoder (SPLADE). Covers loss selection, hard-negative mining, evaluators, distillation, LoRA, Matryoshka, and Hugging Face Hub publishing.
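To make the "loss selection" item concrete: the workhorse loss for bi-encoder (SentenceTransformer) training is in-batch negatives ranking, where each anchor's paired positive is the correct "class" and every other positive in the batch serves as a negative. The sketch below reimplements that math in plain numpy for illustration; it is not the library's API, just the objective it optimizes.

```python
import numpy as np

def in_batch_negatives_loss(anchors: np.ndarray, positives: np.ndarray,
                            scale: float = 20.0) -> float:
    """Cross-entropy over the in-batch similarity matrix: the diagonal
    holds the true (anchor, positive) pairs; off-diagonal entries act
    as negatives. `scale` is the usual similarity temperature."""
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = scale * (a @ p.T)  # (batch, batch) similarity matrix
    # row-wise log-softmax; pick out the diagonal (true-pair) log-probs
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))
```

When anchors sit close to their own positives and far from the others, the loss approaches zero; mismatched pairs drive it up, which is exactly the gradient signal a retrieval encoder needs.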

Purpose

To provide a structured and comprehensive system for users to train or fine-tune sentence-transformers models across various architectures and techniques, simplifying complex ML workflows.

Features

  • Supports SentenceTransformer, CrossEncoder, and SparseEncoder architectures
  • Covers loss selection, hard-negative mining, and evaluators
  • Includes guidance on LoRA, Matryoshka, and distillation
  • Facilitates Hugging Face Hub publishing
  • Provides production-ready example scripts and detailed references
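Hard-negative mining, listed above, means pairing each query with the corpus passages that look most similar to it but are not its true positive; training on these is far more informative than random negatives. The library ships mining utilities for this, but the core selection step reduces to a similarity search, sketched here in plain numpy:

```python
import numpy as np

def mine_hard_negatives(query_emb: np.ndarray, corpus_embs: np.ndarray,
                        positive_idx: int, k: int = 3) -> list:
    """Return indices of the k corpus entries most similar to the query,
    excluding the known positive -- these are the 'hard' negatives."""
    q = query_emb / np.linalg.norm(query_emb)
    c = corpus_embs / np.linalg.norm(corpus_embs, axis=1, keepdims=True)
    sims = c @ q                    # cosine similarity of each passage to the query
    sims[positive_idx] = -np.inf    # never select the positive itself
    return list(np.argsort(-sims)[:k])
```

In practice mining is done with a (possibly weaker) pretrained encoder over the full corpus, and the mined triples feed a ranking loss.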

Use cases

  • Training sentence-transformers for retrieval, similarity search, or clustering.
  • Fine-tuning models for specific downstream tasks like classification or reranking.
  • Implementing SPLADE models for sparse retrieval systems.
  • Exploring advanced training techniques like LoRA or distillation.
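On the Matryoshka technique mentioned above: a Matryoshka-trained model packs useful information into every prefix of its embedding, so at inference you can trade accuracy for speed by simply truncating vectors to a smaller dimensionality and renormalizing. A minimal sketch of that serving-time step, assuming the embeddings were already trained with a Matryoshka loss:

```python
import numpy as np

def truncate_embedding(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep only the first `dim` components of a Matryoshka embedding
    and renormalize to unit length for cosine-similarity search."""
    truncated = emb[:dim]
    return truncated / np.linalg.norm(truncated)
```

For example, a 768-dimensional embedding can be cut to 256 or 64 dimensions, shrinking the vector index proportionally at a modest cost in retrieval quality.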

Non-goals

  • Synthesizing training scripts from scratch without using provided templates.
  • Replacing the core Hugging Face `transformers` or `sentence-transformers` libraries.
  • Providing a GUI for model training.

Installation

Add the marketplace first:

/plugin marketplace add huggingface/skills
/plugin install train-sentence-transformers@huggingface-skills

Quality score

Verified
99/100
Analyzed 1 day ago

Trust signals

Last commit: 2 days ago
Stars: 10.5k
License: Apache-2.0
Status
View source