LLM Council

Skill · Verified · Active
Part of: Pro Workflow

Provider-agnostic multi-LLM deliberation. Three phases — independent responses, cross-model anonymized ranking, chairman synthesis. Provider config from env (OPENAI/ANTHROPIC/FIREWORKS/OPENROUTER/custom OpenAI-compatible base URL). Persists transcript to a wiki page when --wiki <slug> is passed. Use when the user wants multiple AI perspectives, consensus-building, or the "LLM Council" approach for high-stakes reviews, plan critique, or contested learning rules.

Purpose

To enable users to leverage multiple AI perspectives for high-stakes reviews, consensus-building, or critical decision-making by facilitating a structured multi-LLM deliberation process.

Features

  • Provider-agnostic multi-LLM deliberation
  • Three-phase process (independent, ranking, synthesis)
  • Supports OpenAI, Anthropic, OpenRouter, Fireworks, and custom providers via env vars
  • Persists transcripts to wiki pages
  • Configurable model rosters and chairman selection
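The three-phase flow listed above can be sketched in TypeScript. This is a minimal illustration of the pattern, not the skill's actual implementation; all type and function names here are hypothetical, and the model calls are stubbed as plain functions.

```typescript
// Illustrative sketch of the council pattern. ModelFn stands in for a
// real provider call (OpenAI, Anthropic, etc.); names are hypothetical.
type ModelFn = (prompt: string) => string;

// Phase 1: every council model answers the query independently.
function independentResponses(
  models: Record<string, ModelFn>,
  query: string,
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, model] of Object.entries(models)) {
    out[name] = model(query);
  }
  return out;
}

// Phase 2: each model ranks the anonymized responses. Labels like
// "Response A" hide which model wrote what, reducing self-preference bias.
function anonymizedRanking(
  models: Record<string, ModelFn>,
  responses: Record<string, string>,
): string[] {
  const labeled = Object.values(responses)
    .map((r, i) => `Response ${String.fromCharCode(65 + i)}: ${r}`)
    .join("\n");
  return Object.values(models).map((model) =>
    model(`Rank these answers from best to worst:\n${labeled}`),
  );
}

// Phase 3: a designated chairman model synthesizes the responses and
// rankings into one final answer.
function chairmanSynthesis(
  chairman: ModelFn,
  responses: Record<string, string>,
  rankings: string[],
): string {
  return chairman(
    `Synthesize a final answer.\nResponses:\n${Object.values(responses).join(
      "\n",
    )}\nRankings:\n${rankings.join("\n")}`,
  );
}
```

In a real run, each `ModelFn` would be an async API call and the chairman would be one of the configurable roster entries; the anonymization in phase 2 is the key design choice, since models tend to favor their own output when authorship is visible.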

Use Cases

  • Gaining multiple AI perspectives for complex problems
  • Building consensus on critical decisions or plan critiques
  • Utilizing the 'LLM Council' approach for high-stakes reviews
  • Creating persistent, searchable wiki pages from AI deliberations

Non-Goals

  • Acting as a direct replacement for individual LLM API calls without the deliberation framework
  • Providing pre-defined AI perspectives without user-defined queries
  • Managing LLM provider accounts or billing

Prerequisites

  • LLM API keys configured via environment variables (e.g., ANTHROPIC_API_KEY, OPENAI_API_KEY)
  • Node.js runtime
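Per the prerequisites above, provider credentials come from environment variables. A minimal shell sketch follows; `OPENAI_API_KEY` and `ANTHROPIC_API_KEY` are named in this listing, while the base-URL variable shown for a custom OpenAI-compatible endpoint is an assumption — check the skill's own docs for the exact name.

```shell
# Keys named in this listing (values are placeholders):
export OPENAI_API_KEY="sk-placeholder"
export ANTHROPIC_API_KEY="sk-ant-placeholder"

# Hypothetical variable for a custom OpenAI-compatible base URL
# (this name is an assumption, not confirmed by the listing):
export OPENAI_BASE_URL="https://my-gateway.example.com/v1"
```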

Installation

First, add the marketplace, then install the plugin:

/plugin marketplace add rohitg00/pro-workflow
/plugin install pro-workflow@pro-workflow

Quality Score

Verified
99/100
Analyzed about 22 hours ago

Trust Signals

Last commit: 3 days ago
Stars: 2.1k
License: MIT

Similar Extensions

Oh My Claudecode

100

Process-first advisor routing for Claude, Codex, or Gemini via `omc ask`, with artifact capture and no raw CLI assembly

Skill
Yeachan-Heo

Lark Wiki CLI

100

Feishu (Lark) wiki: manage knowledge spaces, space members, and document nodes. Create and query knowledge spaces, view and manage space members, manage node hierarchies, and organize documents and shortcuts within the wiki. Use when the user needs to find or create documents in the wiki, browse a knowledge space's structure, view or manage space members, or move or copy nodes.

Skill
larksuite

Rag Architect

100

Use when the user asks to design RAG pipelines, optimize retrieval strategies, choose embedding models, implement vector search, or build knowledge retrieval systems.

Skill
alirezarezvani

LLM Wiki

100

Use when building or maintaining a persistent personal knowledge base (second brain) in Obsidian where an LLM incrementally ingests sources, updates entity/concept pages, maintains cross-references, and keeps a synthesis current. Triggers include "second brain", "Obsidian wiki", "personal knowledge management", "ingest this paper/article/book", "build a research wiki", "compound knowledge", "Memex", or whenever the user wants knowledge to accumulate across sessions instead of being re-derived by RAG on every query.

Skill
alirezarezvani

Recursive Research

100

Deep recursive research with a self-regulating loop up to PhD level. Applicable to any domain (science, technology, business, art, humanities). Uses WDM + Munger inversion for autonomous decisions, tiering of trusted sources, and checkpointing to disk to survive context limits.

Skill
Anjos2

Understand Knowledge

100

Analyze a Karpathy-pattern LLM wiki knowledge base and generate an interactive knowledge graph with entity extraction, implicit relationships, and topic clustering.

Skill
Lum1104

© 2025 SkillRepo · Find the right skill, skip the noise.