
Ruflo Ruvllm

Plugin · Verified · Active
Part of: Ruflo

RuVLLM local inference with chat formatting (Claude/GPT/Gemini/Ollama/Cohere), model configuration, MicroLoRA fine-tuning, and SONA real-time adaptation

2 Skills · 0 MCPs
Purpose

To enable users to run and fine-tune large language models locally, with tooling for model configuration, lightweight adaptation, and integration into RAG pipelines.

Features

  • Local LLM inference with RuVLLM
  • Model configuration and optimization
  • MicroLoRA task-specific fine-tuning
  • SONA real-time adaptation
  • Multi-provider chat formatting (Claude, GPT, Gemini, Ollama, Cohere)
  • HNSW routing for RAG context retrieval
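The multi-provider chat formatting feature can be sketched roughly as follows. This is an illustration of the general technique only, not RuVLLM's actual API: the function name, payload shapes, and provider keys below are assumptions modeled on the publicly documented message formats of each provider.

```python
# Illustrative sketch of multi-provider chat formatting.
# NOT RuVLLM's API: format_chat and all payload shapes are hypothetical.

def format_chat(messages, provider):
    """Convert a neutral [(role, text), ...] history into a
    provider-flavored payload dict. Hypothetical helper."""
    if provider == "claude":
        # Anthropic-style: system prompt carried separately from the turns
        system = " ".join(t for r, t in messages if r == "system")
        turns = [{"role": r, "content": t} for r, t in messages if r != "system"]
        return {"system": system, "messages": turns}
    if provider == "cohere":
        # Cohere-style: prior turns as chat_history, latest turn as message
        *history, (_, last_text) = messages
        return {
            "chat_history": [{"role": r.upper(), "message": t} for r, t in history],
            "message": last_text,
        }
    if provider in ("gpt", "gemini", "ollama"):
        # Flat role/content list, the most common shape
        return {"messages": [{"role": r, "content": t} for r, t in messages]}
    raise ValueError(f"unknown provider: {provider}")
```

For example, `format_chat([("system", "Be brief."), ("user", "Hi")], "claude")` yields a payload with `"system": "Be brief."` and a single user turn, while the `"cohere"` target moves earlier turns into `chat_history`.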

Use Cases

  • Configuring optimal local LLM models for specific tasks
  • Fine-tuning LLMs with lightweight adapters (MicroLoRA) for specialized domains
  • Implementing real-time adaptation (SONA) for continuous feedback loops
  • Preparing prompts for various LLM providers and integrating RAG context
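The lightweight-adapter idea follows the standard LoRA recipe: keep the base weight W frozen and learn a low-rank update scaled by alpha/r, so y = xW + (alpha/r)·(xA)B. A minimal pure-Python sketch of that math, assuming nothing about RuVLLM's own MicroLoRA API:

```python
# Minimal LoRA-style forward pass (pure Python, no framework).
# Names and shapes are illustrative, not RuVLLM's MicroLoRA API.

def matmul(X, Y):
    """Naive matrix multiply for small nested-list matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(x, W, A, B, alpha, r):
    """y = xW + (alpha/r) * (xA)B, with frozen W (d_in x d_out),
    A (d_in x r) projecting down and B (r x d_out) projecting up."""
    base = matmul(x, W)             # frozen path
    delta = matmul(matmul(x, A), B) # low-rank adapter path
    s = alpha / r
    return [[b + s * d for b, d in zip(br, dr)] for br, dr in zip(base, delta)]

# Tiny example: d_in = d_out = 2, rank r = 1
x = [[1.0, 2.0]]
W = [[1.0, 0.0], [0.0, 1.0]]   # identity base weight
A = [[1.0], [1.0]]             # down-projection
B = [[2.0, 0.0]]               # up-projection
y = lora_forward(x, W, A, B, alpha=1.0, r=1)  # [[7.0, 2.0]]
```

One design note this makes visible: if B is initialized to zeros, the adapter path contributes nothing at step zero, so fine-tuning starts exactly from the base model's behavior.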

Non-Goals

  • Cloud-based LLM inference
  • Replacing core Claude Code functionality
  • General-purpose system administration tools

Installation

Add the marketplace first:

/plugin marketplace add ruvnet/ruflo
/plugin install ruflo-ruvllm@ruflo

Quality Score

Verified
98/100
Analyzed about 24 hours ago

Trust Signals

Last commit: 1 day ago
Stars: 50.2k
License: MIT
Status
View source