Implementing Warehouse Sources
Skill · Active
Implement and extend PostHog data warehouse import sources. Use when adding a new source under `posthog/temporal/data_imports/sources/`, adding datasets/endpoints to an existing source, or adding incremental sync, resumable imports, webhook ingestion, pagination, credentials validation, or source tests.
The goal is to guide developers in building and extending data warehouse import sources for PostHog, enabling data integration from a range of external tools.
Features
- Extends PostHog data warehouse import sources
- Adds new sources, datasets, and endpoints
- Supports incremental sync, resumable imports, and webhook ingestion
- Provides guidance on credentials validation and source tests
- Details base class selection and end-to-end workflow
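As a rough sketch of what a new source involves, the example below outlines a hypothetical source class with dataset declarations, credential validation, and a row generator. The class, method, and attribute names are illustrative assumptions, not the actual PostHog base-class API; check the existing sources under `posthog/temporal/data_imports/sources/` for the real interface.

```python
# Illustrative sketch only: the names below (ExampleSource, validate_credentials, rows)
# are hypothetical and do NOT reflect the real PostHog base-class API.
from dataclasses import dataclass
from typing import Iterator


@dataclass
class ExampleCredentials:
    api_key: str


class ExampleSource:
    """Hypothetical warehouse source exposing two datasets (endpoints)."""

    name = "example"
    datasets = ["customers", "invoices"]

    def __init__(self, credentials: ExampleCredentials) -> None:
        self.credentials = credentials

    def validate_credentials(self) -> bool:
        # A real source would make a cheap authenticated call against the upstream API
        # and return False (or raise) when the key is rejected.
        return bool(self.credentials.api_key)

    def rows(self, dataset: str) -> Iterator[dict]:
        # A real source would page through the upstream API here and yield dicts
        # that the import pipeline writes to the warehouse.
        yield from ()
```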
Use Cases
- Adding a new source under `posthog/temporal/data_imports/sources/`
- Adding datasets/endpoints to an existing source
- Implementing incremental sync or resumable imports for APIs (see the sketch after this list)
- Integrating webhook ingestion for event-driven data updates
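The usual pattern for incremental sync and resumable imports is to page through the upstream API with a cursor and persist the highest value seen, so the next run resumes where the previous one stopped. The sketch below shows that pattern against a made-up REST endpoint; the URL, query parameters, and response shape are assumptions for illustration, not any specific source's API.

```python
# Illustrative cursor-based incremental fetch; the endpoint and parameters are made up.
from typing import Any, Iterator, Optional

import requests


def fetch_incremental(
    base_url: str,
    api_key: str,
    since: Optional[str] = None,
    page_size: int = 100,
) -> Iterator[dict[str, Any]]:
    """Yield records newer than `since`, paging until the API reports no more data."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {api_key}"

    cursor = since
    while True:
        params = {"limit": page_size}
        if cursor:
            params["updated_after"] = cursor
        response = session.get(f"{base_url}/records", params=params, timeout=30)
        response.raise_for_status()
        payload = response.json()

        records = payload.get("data", [])
        for record in records:
            yield record

        if records:
            # Persisting the last cursor (e.g. the max updated_at seen) is what makes the
            # import resumable: a later run passes it back in as `since`.
            cursor = max(r["updated_at"] for r in records)
        if not payload.get("has_more"):
            break
```

Yielding records as a generator keeps memory flat and lets the pipeline checkpoint the cursor between pages rather than re-fetching everything on every run.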
Non-Goals
- Implementing actual data sources directly
- Acting as a generic data pipeline tool
- Replacing the need for understanding the target API's documentation
Trust
- Warning (Issues attention): The repository has a high number of open issues (544) compared to closed issues (163) in the last 90 days, indicating slow response times or a backlog.
Practical Utility
- Info (Usage examples): The SKILL.md provides extensive inline code examples for implementation patterns and structure, though not end-to-end runnable examples of specific sources.
Installation
`npx skills add PostHog/posthog`
Runs the Vercel skills CLI (skills.sh) via npx; it needs Node.js locally and at least one installed skills-compatible agent (Claude Code, Cursor, Codex, …). It assumes the repo follows the agentskills.io format.
Similar Extensions
- Senior Data Engineer (95): Data engineering skill for building scalable data pipelines, ETL/ELT systems, and data infrastructure. Expertise in Python, SQL, Spark, Airflow, dbt, Kafka, and the modern data stack. Includes data modeling, pipeline orchestration, data quality, and DataOps. Use when designing data architectures, building data pipelines, optimizing data workflows, implementing data governance, or troubleshooting data issues.
- Suggesting Data Imports (76): Use when the user asks about revenue, payments, subscriptions, billing, CRM deals, support tickets, production database tables, or other data that PostHog does not collect natively. Also use when a query fails because a table does not exist or returns no results for expected external data. The data warehouse can import from SaaS tools (Stripe, Hubspot, etc.), production databases (Postgres, MySQL, BigQuery, Snowflake), and other arbitrary data sources. Covers checking existing sources, identifying the right source type, and guiding the setup.
- Typescript Advanced Types (100): Master TypeScript's advanced type system including generics, conditional types, mapped types, template literals, and utility types for building type-safe applications. Use when implementing complex type logic, creating reusable type utilities, or ensuring compile-time type safety in TypeScript projects.
- Validate Plugin (100): Validate a Claude Code plugin structure, frontmatter, and MCP tool references.
- Migrate Validate (100): Validate pending migrations for foreign key consistency, rollback safety, and best practices.
- Lean Ctx (100): Context Runtime for AI Agents: 59 MCP tools, 10 read modes, 95+ shell patterns, tree-sitter AST for 18 languages. Compresses LLM context by up to 99%. Use when reading files, running shell commands, searching code, or exploring directories. Auto-installs if not present.