Ruflo RuVLLM
RuVLLM local inference with chat formatting (Claude/GPT/Gemini/Ollama/Cohere), model configuration, MicroLoRA fine-tuning, and SONA real-time adaptation.
To enable users to run and fine-tune large language models locally with advanced features for optimal performance and integration into RAG pipelines.
Features
- Local LLM inference with RuVLLM
- Model configuration and optimization
- MicroLoRA task-specific fine-tuning
- SONA real-time adaptation
- Multi-provider chat formatting (Claude, GPT, Gemini, Ollama, Cohere)
- HNSW routing for RAG context retrieval
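The multi-provider chat formatting listed above can be illustrated with a minimal sketch: a neutral message list is reshaped into the payload layout each provider family expects. The function name and exact payload shapes below are illustrative assumptions, not the plugin's actual API.

```python
# Illustrative sketch of multi-provider chat formatting; the function
# name and payload shapes are assumptions, not RuVLLM's actual API.

def format_chat(messages, provider):
    """Reshape a neutral [{'role', 'content'}] list for a given provider."""
    system = " ".join(m["content"] for m in messages if m["role"] == "system")
    turns = [m for m in messages if m["role"] != "system"]

    if provider == "claude":
        # Anthropic-style: the system prompt travels outside the messages array.
        return {"system": system, "messages": turns}
    if provider in ("gpt", "ollama", "cohere"):
        # OpenAI-style: the system prompt is just another message role.
        return {"messages": messages}
    if provider == "gemini":
        # Gemini-style: 'contents' with 'parts'; the assistant role is 'model'.
        role_map = {"user": "user", "assistant": "model"}
        return {
            "system_instruction": {"parts": [{"text": system}]},
            "contents": [
                {"role": role_map[m["role"]], "parts": [{"text": m["content"]}]}
                for m in turns
            ],
        }
    raise ValueError(f"unknown provider: {provider}")

msgs = [
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Hi"},
]
claude_payload = format_chat(msgs, "claude")
gemini_payload = format_chat(msgs, "gemini")
```

The point of a formatting layer like this is that the rest of the pipeline (retrieval, prompt assembly) stays provider-agnostic.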
Use cases
- Configuring optimal local LLM models for specific tasks.
- Fine-tuning LLMs with lightweight adapters (MicroLoRA) for specialized domains.
- Implementing real-time adaptation (SONA) for continuous feedback loops.
- Preparing prompts for various LLM providers and integrating RAG context.
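Lightweight adapter fine-tuning of the kind MicroLoRA describes follows the standard low-rank adapter recipe: instead of updating a full weight matrix W, train a small pair of matrices A and B so the effective weight becomes W + (alpha/r)·B·A. A minimal NumPy sketch, where all names and hyperparameters are illustrative rather than RuVLLM's API:

```python
import numpy as np

# Minimal low-rank adapter (LoRA-style) sketch; names, shapes, and
# hyperparameters are illustrative, not RuVLLM's MicroLoRA API.
rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 64, 4, 8.0

W = rng.standard_normal((d_out, d_in))      # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def forward(x):
    # Base path plus the scaled low-rank correction.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
base_out = W @ x
adapted_out = forward(x)
# Because B starts at zero, the adapter is a no-op until training updates it,
# so fine-tuning begins exactly at the base model's behavior.
```

Only A and B (r·(d_in + d_out) parameters) are trained, which is why adapters of this kind are cheap enough for per-task specialization.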
Non-goals
- Cloud-based LLM inference
- Replacing core Claude Code functionality
- General-purpose system administration tools
Installation
First, add the marketplace:
/plugin marketplace add ruvnet/ruflo
Then install the plugin:
/plugin install ruflo-ruvllm@ruflo
Similar extensions
Microsoft Learn MCP Server
Score: 100. Access official Microsoft documentation, API references, and code samples for Azure, .NET, Windows, and more.
Build with Claude
Score: 99. Docker-based MCP servers from the official Docker MCP registry, including 199+ verified servers.
Ruflo Rag Memory
Score: 99. RuVector memory with HNSW search, AgentDB, and semantic retrieval.
Brave Search Skills
Score: 99. Official Brave Search API skills for AI coding agents.
Build with Claude
Score: 98. Complete collection of 117 specialized AI agents across 11 categories.
Fp Check
Score: 97. Systematic false positive verification for security bug analysis with mandatory gate reviews.