
LLM Council

Skill · Verified · Active
Part of: Pro Workflow

Provider-agnostic multi-LLM deliberation. Three phases — independent responses, cross-model anonymized ranking, chairman synthesis. Provider config from env (OPENAI/ANTHROPIC/FIREWORKS/OPENROUTER/custom OpenAI-compatible base URL). Persists transcript to a wiki page when --wiki <slug> is passed. Use when the user wants multiple AI perspectives, consensus-building, or the "LLM Council" approach for high-stakes reviews, plan critique, or contested learning rules.

Purpose

Lets users draw on multiple AI perspectives for high-stakes reviews, consensus-building, and critical decisions by running a structured multi-LLM deliberation.

Features

  • Provider-agnostic multi-LLM deliberation
  • Three-phase process (independent, ranking, synthesis)
  • Supports OpenAI, Anthropic, OpenRouter, Fireworks, and custom providers via env vars
  • Persists transcripts to wiki pages
  • Configurable model rosters and chairman selection
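The three-phase flow above can be sketched in TypeScript as follows. This is a minimal illustration, not the skill's actual implementation: the `AskFn` signature, the A/B/C labeling, and the ranking-prompt wording are all assumptions; the real skill dispatches to whichever providers the env config names.

```typescript
// Hedged sketch of the three-phase council loop (independent answers,
// anonymized cross-ranking, chairman synthesis). All names are illustrative.
type AskFn = (model: string, prompt: string) => string;

interface CouncilResult {
  responses: Record<string, string>;  // phase 1: one answer per model
  rankings: Record<string, string[]>; // phase 2: each model's best-first ordering
  synthesis: string;                  // phase 3: chairman's final answer
}

function runCouncil(
  models: string[],
  chairman: string,
  question: string,
  ask: AskFn
): CouncilResult {
  // Phase 1: each model answers the question independently.
  const responses: Record<string, string> = {};
  for (const m of models) responses[m] = ask(m, question);

  // Phase 2: strip model identities (labels A, B, ...) so rankings
  // judge content, not reputation, then ask every model to rank.
  const labels = models.map((_, i) => String.fromCharCode(65 + i));
  const anonymized = labels
    .map((l, i) => `Answer ${l}:\n${responses[models[i]]}`)
    .join("\n\n");
  const rankings: Record<string, string[]> = {};
  for (const m of models) {
    const reply = ask(m, `Rank these answers best-first, e.g. "B, A, C":\n\n${anonymized}`);
    rankings[m] = reply.split(",").map((s) => s.trim());
  }

  // Phase 3: the chairman sees all answers plus all rankings and synthesizes.
  const synthesis = ask(
    chairman,
    `Question: ${question}\n\n${anonymized}\n\nRankings: ${JSON.stringify(rankings)}\n\nSynthesize a final answer.`
  );
  return { responses, rankings, synthesis };
}
```

Keeping phase 2 anonymized is the key design choice: models are known to favor their own or well-known peers' outputs, so identities are hidden until the chairman's synthesis.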

Use cases

  • Gaining multiple AI perspectives for complex problems
  • Building consensus on critical decisions or plan critiques
  • Utilizing the 'LLM Council' approach for high-stakes reviews
  • Creating persistent, searchable wiki pages from AI deliberations

Non-goals

  • Acting as a direct replacement for individual LLM API calls without the deliberation framework
  • Providing pre-defined AI perspectives without user-defined queries
  • Managing LLM provider accounts or billing

Prerequisites

  • LLM API keys configured via environment variables (e.g., ANTHROPIC_API_KEY, OPENAI_API_KEY)
  • Node.js runtime
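A minimal environment setup might look like the following. Only OPENAI_API_KEY and ANTHROPIC_API_KEY are named in the prerequisites above; the OpenRouter/Fireworks/base-URL variable names below are assumptions, so check the skill's README for the exact names it reads.

```shell
# Keys for the two providers named in the prerequisites (values are placeholders):
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."

# Optional additional providers -- these variable names are assumed, not confirmed:
# export OPENROUTER_API_KEY="..."
# export FIREWORKS_API_KEY="..."
# Custom OpenAI-compatible endpoint (assumed variable name):
# export OPENAI_BASE_URL="https://gateway.example.com/v1"
```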

Installation

First, add the marketplace:

/plugin marketplace add rohitg00/pro-workflow
/plugin install pro-workflow@pro-workflow

Quality score

Verified
99 / 100
Analyzed about 21 hours ago

Trust signals

Last commit: 3 days ago
Stars: 2.1k
License: MIT

Similar extensions

Oh My Claudecode

100

Process-first advisor routing for Claude, Codex, or Gemini via `omc ask`, with artifact capture and no raw CLI assembly

Skill
Yeachan-Heo

Lark Wiki CLI

100

Feishu (Lark) wiki: manage knowledge spaces, space members, and document nodes. Create and query knowledge spaces, view and manage space members, manage node hierarchies, and organize documents and shortcuts within the wiki. Use when the user needs to find or create documents in a wiki, browse the knowledge-space structure, view or manage space members, or move or copy nodes.

Skill
larksuite

Rag Architect

100

Use when the user asks to design RAG pipelines, optimize retrieval strategies, choose embedding models, implement vector search, or build knowledge retrieval systems.

Skill
alirezarezvani

LLM Wiki

100

Use when building or maintaining a persistent personal knowledge base (second brain) in Obsidian where an LLM incrementally ingests sources, updates entity/concept pages, maintains cross-references, and keeps a synthesis current. Triggers include "second brain", "Obsidian wiki", "personal knowledge management", "ingest this paper/article/book", "build a research wiki", "compound knowledge", "Memex", or whenever the user wants knowledge to accumulate across sessions instead of being re-derived by RAG on every query.

Skill
alirezarezvani

Recursive Research

100

Deep recursive research with a self-regulating loop up to PhD level. Applicable to any field (science, technology, business, art, the humanities). Uses WDM + Munger inversion for autonomous decisions, tiers trustworthy sources, and stores checkpoints on disk to overcome context limits.

Skill
Anjos2

Understand Knowledge

100

Analyze an LLM wiki knowledge base following the Karpathy pattern and generate an interactive knowledge graph with entity extraction, implicit relationships, and topic clusters.

Skill
Lum1104