[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"extension-skill-clickhouse-chdb-datastore-en":3,"guides-for-clickhouse-chdb-datastore":450,"similar-k175fytrqb2bcz2bg505w804j986m6he-en":451},{"_creationTime":4,"_id":5,"children":6,"community":7,"display":9,"evaluation":15,"identity":242,"isFallback":232,"parentExtension":246,"providers":300,"relations":304,"repo":305,"tags":448,"workflow":449},1778683910609.901,"k175fytrqb2bcz2bg505w804j986m6he",[],{"reviewCount":8},0,{"description":10,"installMethods":11,"name":13,"sourceUrl":14},"Drop-in pandas replacement with ClickHouse performance. Use `import chdb.datastore as pd` (or `from datastore import DataStore`) and write standard pandas code — same API, 10-100x faster on large datasets. Supports 16+ data sources (MySQL, PostgreSQL, S3, MongoDB, ClickHouse, Iceberg, Delta Lake, etc.) and 10+ file formats (Parquet, CSV, JSON, Arrow, ORC, etc.) with cross-source joins. Use this skill when the user wants to analyze data with pandas-style syntax, speed up slow pandas code, query remote databases or cloud storage as DataFrames, or join data across different sources — even if they don't explicitly mention chdb or DataStore. 
Do NOT use for raw SQL queries, ClickHouse server administration, or non-Python languages.",{"claudeCode":12},"clickhouse/agent-skills","chdb-datastore","https://github.com/clickhouse/agent-skills",{"_creationTime":16,"_id":17,"extensionId":5,"locale":18,"result":19,"trustSignals":223,"workflow":240},1778684010861.2068,"kn75sexfhd06pwpsg0w3pvympd86nne4","en",{"checks":20,"evaluatedAt":190,"extensionSummary":191,"features":192,"nonGoals":198,"promptVersionExtension":202,"promptVersionScoring":203,"purpose":204,"rationale":205,"score":206,"summary":207,"tags":208,"targetMarket":216,"tier":217,"useCases":218},[21,26,29,32,36,39,44,48,50,53,57,61,64,68,71,74,77,80,83,86,90,94,99,103,107,110,113,116,120,123,126,129,131,133,136,140,144,148,151,155,158,161,164,167,171,174,177,180,183,187],{"category":22,"check":23,"severity":24,"summary":25},"Practical Utility","Problem relevance","pass","The description clearly states the problem of slow pandas performance and the need to analyze data from various sources, directly addressing user pain points.",{"category":22,"check":27,"severity":24,"summary":28},"Unique selling proposition","The skill offers a significant performance improvement over standard pandas by leveraging ClickHouse and supports a wide range of data sources, providing value beyond basic pandas functionality.",{"category":22,"check":30,"severity":24,"summary":31},"Production readiness","The skill provides a complete pandas API replacement, supports numerous data sources, and includes installation and usage examples, making it suitable for real-world workflows.",{"category":33,"check":34,"severity":24,"summary":35},"Scope","Single responsibility principle","The skill focuses on providing a pandas-like interface for data analysis across various sources, with clear boundaries against raw SQL queries or ClickHouse server administration.",{"category":33,"check":37,"severity":24,"summary":38},"Description quality","The description accurately reflects the functionality 
of the skill, clearly outlining its purpose, capabilities, supported data sources, and intended use cases.",{"category":40,"check":41,"severity":42,"summary":43},"Invocation","Scoped tools","not_applicable","This is a skill, not an MCP extension, and does not expose individual tools.",{"category":45,"check":46,"severity":24,"summary":47},"Documentation","Configuration & parameter reference","The API reference and connectors documentation comprehensively detail connection methods, parameters, and available methods.",{"category":33,"check":49,"severity":42,"summary":43},"Tool naming",{"category":33,"check":51,"severity":24,"summary":52},"Minimal I/O surface","The skill's interface is the pandas API, which is well-defined. DataStore operations return DataFrames or Series, as expected.",{"category":54,"check":55,"severity":24,"summary":56},"License","License usability","The extension is licensed under Apache-2.0, a permissive open-source license, as indicated in the SKILL.md frontmatter and the LICENSE file.",{"category":58,"check":59,"severity":24,"summary":60},"Maintenance","Commit recency","The latest commit was on 2026-05-13, indicating recent maintenance.",{"category":58,"check":62,"severity":24,"summary":63},"Dependency Management","The project uses pip and a lockfile (hasLockfile: true) for dependency management.",{"category":65,"check":66,"severity":24,"summary":67},"Security","Secret Management","The skill is designed to connect to data sources which may require credentials, but there's no indication of secrets being hardcoded or leaked. 
Connection details are passed as parameters or via environment variables.",{"category":65,"check":69,"severity":24,"summary":70},"Injection","The skill interacts with data sources and files, but the provided documentation and examples suggest a structured approach to data handling rather than arbitrary code execution.",{"category":65,"check":72,"severity":24,"summary":73},"Transitive Supply-Chain Grenades","The skill relies on the `chdb` library, which is installed via pip. There is no indication of runtime fetching of code or arbitrary remote execution.",{"category":65,"check":75,"severity":24,"summary":76},"Sandbox Isolation","The skill primarily interacts with data sources and files. Operations appear to be confined to the scope of data retrieval and manipulation within those sources.",{"category":65,"check":78,"severity":24,"summary":79},"Sandbox escape primitives","No evidence of detached processes or retry loops around denied calls was found in the provided scripts or documentation.",{"category":65,"check":81,"severity":24,"summary":82},"Data Exfiltration","The skill handles data connections but does not instruct the agent to submit confidential data to third parties. 
Credentials are passed to the data source connection functions.",{"category":65,"check":84,"severity":24,"summary":85},"Hidden Text Tricks","The bundled files and documentation do not contain any hidden text tricks or obfuscated instructions.",{"category":87,"check":88,"severity":24,"summary":89},"Hooks","Opaque code execution","The provided scripts are written in plain Python and do not use obfuscation techniques like base64 encoding or runtime code fetching.",{"category":91,"check":92,"severity":24,"summary":93},"Portability","Structural Assumption","The skill operates on data sources specified by the user; it does not make assumptions about project structure beyond the paths provided.",{"category":95,"check":96,"severity":97,"summary":98},"Trust","Issues Attention","info","22 issues opened, 0 closed in the last 90 days, indicating slow response times from maintainers.",{"category":100,"check":101,"severity":24,"summary":102},"Versioning","Release Management","The skill has a declared version '4.1' in the SKILL.md frontmatter.",{"category":104,"check":105,"severity":24,"summary":106},"Code Execution","Validation","The underlying `chdb` library likely handles validation of connection parameters and data structures. The skill's interface is the pandas API, which has its own type checking.",{"category":65,"check":108,"severity":24,"summary":109},"Unguarded Destructive Operations","The skill is primarily for data analysis and querying; destructive operations like data deletion are not part of its core functionality and thus not unguarded.",{"category":104,"check":111,"severity":24,"summary":112},"Error Handling","The skill leverages the `chdb` library which is expected to handle errors gracefully. 
Connection errors and query failures would be reported via exceptions.",{"category":104,"check":114,"severity":42,"summary":115},"Logging","The skill itself does not appear to implement custom logging beyond what the underlying `chdb` library or the agent framework provides.",{"category":117,"check":118,"severity":97,"summary":119},"Compliance","GDPR","The skill operates on user-provided data sources, which may contain personal data. No explicit sanitization is mentioned, but data is not sent to third parties without user action.",{"category":117,"check":121,"severity":24,"summary":122},"Target market","The skill's functionality is general and does not appear to be restricted to any specific geographic or legal jurisdiction. `targetMarket` is set to 'global'.",{"category":91,"check":124,"severity":24,"summary":125},"Runtime stability","The skill requires Python 3.9+ and can run on macOS or Linux, as stated in the compatibility requirements. It does not appear to have other OS-specific dependencies.",{"category":45,"check":127,"severity":24,"summary":128},"README","A README.md file exists and clearly states the extension's purpose, installation, and included components.",{"category":33,"check":130,"severity":42,"summary":43},"Tool surface size",{"category":40,"check":132,"severity":42,"summary":43},"Overlapping near-synonym tools",{"category":45,"check":134,"severity":24,"summary":135},"Phantom features","All advertised features, such as pandas API compatibility, data source support, and cross-source joins, are supported by the extensive documentation and examples.",{"category":137,"check":138,"severity":24,"summary":139},"Install","Installation instruction","Installation instructions are provided in the README and SKILL.md, including a copy-pasteable `npx skills add` command and a verification script.",{"category":141,"check":142,"severity":24,"summary":143},"Errors","Actionable error messages","The troubleshooting section in SKILL.md provides actionable advice for 
common errors, including file not found, connection timeouts, and join issues.",{"category":145,"check":146,"severity":24,"summary":147},"Execution","Pinned dependencies","The project uses pip for dependency management, and the `hasLockfile` trust signal indicates that dependencies are pinned.",{"category":33,"check":149,"severity":42,"summary":150},"Dry-run preview","The skill is primarily for data analysis and querying; it does not perform state-changing operations that would require a dry-run mode.",{"category":152,"check":153,"severity":24,"summary":154},"Protocol","Idempotent retry & timeouts","The underlying `chdb` library is expected to handle connection timeouts and retries appropriately for data source interactions.",{"category":117,"check":156,"severity":24,"summary":157},"Telemetry opt-in","There is no mention of telemetry collection in the documentation. It is assumed to be off by default.",{"category":40,"check":159,"severity":24,"summary":160},"Precise Purpose","The description clearly defines the artifact (pandas API replacement) and the user intent (analyze data, speed up code, query remote sources), with explicit non-goals.",{"category":40,"check":162,"severity":24,"summary":163},"Concise Frontmatter","The frontmatter is concise, self-contained, and accurately summarizes the core capability and trigger phrases.",{"category":45,"check":165,"severity":24,"summary":166},"Concise Body","The SKILL.md is concise and delegates detailed information to separate reference files.",{"category":168,"check":169,"severity":24,"summary":170},"Context","Progressive Disclosure","Detailed information about API reference and connectors is provided in separate markdown files, demonstrating good progressive disclosure.",{"category":168,"check":172,"severity":42,"summary":173},"Forked exploration","The skill is not an exploration or audit-style skill and does not require `context: fork`.",{"category":22,"check":175,"severity":24,"summary":176},"Usage 
examples","Sufficient runnable examples are provided, covering various data sources, joins, and common operations, with expected outputs.",{"category":22,"check":178,"severity":24,"summary":179},"Edge cases","The troubleshooting section addresses common issues like file not found, connection errors, and join type mismatches with recovery steps.",{"category":104,"check":181,"severity":42,"summary":182},"Tool Fallback","The skill does not rely on external MCP servers; it uses the locally installed `chdb` library.",{"category":184,"check":185,"severity":24,"summary":186},"Safety","Halt on unexpected state","The troubleshooting section guides users on how to handle potential issues like missing files or incorrect join keys, implying a halt on unexpected states.",{"category":91,"check":188,"severity":24,"summary":189},"Cross-skill coupling","The skill is self-contained and does not rely on other specific skills being loaded in the same session.",1778684010455,"This skill provides a drop-in replacement for the pandas library, named `chdb.datastore`, which leverages ClickHouse for significantly faster data analysis on large datasets. It supports numerous data sources and file formats, enabling cross-source joins and acting as a powerful tool for accelerating existing pandas workflows.",[193,194,195,196,197],"Drop-in replacement for pandas API","10-100x faster performance","Connects to 16+ data sources (databases, cloud storage, files)","Supports 10+ file formats (Parquet, CSV, JSON, etc.)","Performs cross-source joins seamlessly",[199,200,201],"Performing raw SQL queries (use chdb-sql skill)","ClickHouse server administration","Usage in non-Python languages","3.0.0","4.4.0","To enable users to perform data analysis with familiar pandas syntax but at ClickHouse speeds, and to easily query and join data from diverse sources.","The skill is well-documented, provides clear examples, and addresses a common pain point of slow data analysis. 
The only minor finding is slow issue response times from maintainers.",95,"A high-quality, performant pandas replacement for data analysis and integration.",[209,210,211,212,213,214,215],"data-analysis","pandas","clickhouse","sql","dataframe","etl","data-integration","global","verified",[219,220,221,222],"Analyzing large datasets with pandas-style syntax","Speeding up slow pandas code","Querying remote databases or cloud storage as DataFrames","Joining data across different sources (e.g., database table and parquet file)",{"codeQuality":224,"collectedAt":226,"documentation":227,"maintenance":230,"security":237,"testCoverage":239},{"hasLockfile":225},true,1778683988122,{"descriptionLength":228,"readmeSize":229},735,6756,{"closedIssues90d":8,"forks":231,"hasChangelog":232,"manifestVersion":233,"openIssues90d":234,"pushedAt":235,"stars":236},25,false,"4.1",2,1778669462000,425,{"hasNpmPackage":232,"license":238,"smitheryVerified":232},"Apache-2.0",{"hasCi":225,"hasTests":232},{"updatedAt":241},1778684010861,{"basePath":243,"githubOwner":211,"githubRepo":244,"locale":18,"slug":13,"type":245},"skills/chdb-datastore","agent-skills","skill",{"_creationTime":247,"_id":248,"community":249,"display":250,"identity":254,"parentExtension":257,"providers":285,"relations":295,"tags":296,"workflow":297},1778683910609.9004,"k171w0wat3qnkfpas7mn7yqtb986mfgf",{"reviewCount":8},{"description":251,"installMethods":252,"name":253,"sourceUrl":14},"28 best practice rules for ClickHouse schema design, query optimization, and data ingestion — prioritized by impact",{"claudeCode":253},"clickhouse-best-practices",{"basePath":255,"githubOwner":211,"githubRepo":244,"locale":18,"slug":244,"type":256},"","plugin",{"_creationTime":258,"_id":259,"community":260,"display":261,"identity":265,"providers":267,"relations":278,"tags":280,"workflow":281},1778683910609.9001,"k1790kh9nnyedb58t0bhb9k83s86mcna",{"reviewCount":8},{"description":262,"installMethods":263,"name":264,"sourceUrl":14},"Official 
ClickHouse best practices for Claude Code",{"claudeCode":12},"clickhouse-agent-skills",{"basePath":255,"githubOwner":211,"githubRepo":244,"locale":18,"slug":244,"type":266},"marketplace",{"evaluate":268,"extract":273},{"promptVersionExtension":269,"promptVersionScoring":203,"score":206,"tags":270,"targetMarket":216,"tier":217},"3.1.0",[211,271,212,209,272],"database","developer-tools",{"commitSha":274,"marketplace":275,"plugin":276},"HEAD",{"name":264,"pluginCount":234},{"mcpCount":8,"provider":277,"skillCount":8},"classify",{"repoId":279},"kd7723v6kvsmj7pd0jntz17bkn86ne4f",[211,209,271,272,212],{"evaluatedAt":282,"extractAt":283,"updatedAt":284},1778683929817,1778683910609,1778684301942,{"evaluate":286,"extract":292},{"promptVersionExtension":202,"promptVersionScoring":203,"score":287,"tags":288,"targetMarket":216,"tier":217},97,[211,271,212,289,290,291],"python","devops","analytics",{"commitSha":274,"license":238,"plugin":293},{"mcpCount":8,"provider":277,"skillCount":294},6,{"parentExtensionId":259,"repoId":279},[291,211,271,290,289,212],{"evaluatedAt":298,"extractAt":283,"updatedAt":299},1778683955196,1778684302148,{"evaluate":301,"extract":303},{"promptVersionExtension":202,"promptVersionScoring":203,"score":206,"tags":302,"targetMarket":216,"tier":217},[209,210,211,212,213,214,215],{"commitSha":274},{"parentExtensionId":248,"repoId":279},{"_creationTime":306,"_id":279,"identity":307,"providers":308,"workflow":444},1778683905800.361,{"githubOwner":211,"githubRepo":244,"sourceUrl":14},{"classify":309,"discover":436,"github":439},{"commitSha":274,"extensions":310},[311,324,345,355,370,386,403,412,420,428],{"basePath":255,"description":262,"displayName":264,"installMethods":312,"rationale":313,"selectedPaths":314,"source":323,"sourceLanguage":18,"type":266},{"claudeCode":12},"marketplace.json at 
.claude-plugin/marketplace.json",[315,318,320],{"path":316,"priority":317},".claude-plugin/marketplace.json","mandatory",{"path":319,"priority":317},"README.md",{"path":321,"priority":322},"LICENSE","high","rule",{"basePath":255,"description":251,"displayName":253,"installMethods":325,"license":238,"rationale":326,"selectedPaths":327,"source":323,"sourceLanguage":18,"type":256},{"claudeCode":253},"plugin manifest at .claude-plugin/plugin.json",[328,330,331,332,335,337,339,341,343],{"path":329,"priority":317},".claude-plugin/plugin.json",{"path":319,"priority":317},{"path":321,"priority":322},{"path":333,"priority":334},"skills/chdb-datastore/SKILL.md","medium",{"path":336,"priority":334},"skills/chdb-sql/SKILL.md",{"path":338,"priority":334},"skills/clickhouse-architecture-advisor/SKILL.md",{"path":340,"priority":334},"skills/clickhouse-best-practices/SKILL.md",{"path":342,"priority":334},"skills/clickhousectl-cloud-deploy/SKILL.md",{"path":344,"priority":334},"skills/clickhousectl-local-dev/SKILL.md",{"basePath":346,"description":347,"displayName":348,"installMethods":349,"rationale":350,"selectedPaths":351,"source":323,"sourceLanguage":18,"type":256},"skills/clickhouse-architecture-advisor","Workload-aware architecture decision skill for ClickHouse — ingestion strategies, partitioning, enrichment, upsert patterns, and pre-aggregation with explicit official/derived/field provenance","clickhouse-architecture-advisor",{"claudeCode":348},"inline plugin source from marketplace.json at skills/clickhouse-architecture-advisor",[352,353],{"path":319,"priority":317},{"path":354,"priority":322},"SKILL.md",{"basePath":243,"description":10,"displayName":13,"installMethods":356,"rationale":357,"selectedPaths":358,"source":323,"sourceLanguage":18,"type":245},{"claudeCode":12},"SKILL.md frontmatter at 
skills/chdb-datastore/SKILL.md",[359,360,361,364,366,368],{"path":354,"priority":317},{"path":319,"priority":322},{"path":362,"priority":363},"examples/examples.md","low",{"path":365,"priority":334},"references/api-reference.md",{"path":367,"priority":334},"references/connectors.md",{"path":369,"priority":363},"scripts/verify_install.py",{"basePath":371,"description":372,"displayName":373,"installMethods":374,"rationale":375,"selectedPaths":376,"source":323,"sourceLanguage":18,"type":245},"skills/chdb-sql","In-process ClickHouse SQL engine for Python — run ClickHouse SQL queries directly on local files, remote databases, and cloud storage without a server. Use when the user wants to write SQL queries against Parquet/CSV/JSON files, use ClickHouse table functions (mysql(), s3(), postgresql(), iceberg(), deltaLake() etc.), build stateful analytical pipelines with Session, use parametrized queries, window functions, or other advanced ClickHouse SQL features. Also use when the user explicitly mentions chdb.query(), ClickHouse SQL syntax, or wants cross-source SQL joins. Do NOT use for pandas-style DataFrame operations — use chdb-datastore instead.","chdb-sql",{"claudeCode":12},"SKILL.md frontmatter at skills/chdb-sql/SKILL.md",[377,378,379,380,381,383,385],{"path":354,"priority":317},{"path":319,"priority":322},{"path":362,"priority":363},{"path":365,"priority":334},{"path":382,"priority":334},"references/sql-functions.md",{"path":384,"priority":334},"references/table-functions.md",{"path":369,"priority":363},{"basePath":346,"description":387,"displayName":348,"installMethods":388,"rationale":389,"selectedPaths":390,"source":323,"sourceLanguage":18,"type":245},"MUST USE when designing ClickHouse architectures, selecting between ingestion or modeling patterns, or translating best practices into workload-specific system designs. 
Complements clickhouse-best-practices with decision frameworks and explicit provenance labels.",{"claudeCode":12},"SKILL.md frontmatter at skills/clickhouse-architecture-advisor/SKILL.md",[391,392,393,395,397,399,401],{"path":354,"priority":317},{"path":319,"priority":322},{"path":394,"priority":334},"AGENTS.md",{"path":396,"priority":363},"examples/README.md",{"path":398,"priority":363},"examples/finserv-market-surveillance.md",{"path":400,"priority":363},"examples/observability-high-throughput.md",{"path":402,"priority":363},"examples/siem-security-analytics.md",{"basePath":404,"description":405,"displayName":253,"installMethods":406,"rationale":407,"selectedPaths":408,"source":323,"sourceLanguage":18,"type":245},"skills/clickhouse-best-practices","MUST USE when reviewing ClickHouse schemas, queries, or configurations. Contains 31 rules that MUST be checked before providing recommendations. Always read relevant rule files and cite specific rules in responses.",{"claudeCode":12},"SKILL.md frontmatter at skills/clickhouse-best-practices/SKILL.md",[409,410,411],{"path":354,"priority":317},{"path":319,"priority":322},{"path":394,"priority":334},{"basePath":413,"description":414,"displayName":415,"installMethods":416,"rationale":417,"selectedPaths":418,"source":323,"sourceLanguage":18,"type":245},"skills/clickhouse-client-js/clickhouse-js-node-troubleshooting","Troubleshoot and resolve common issues with the ClickHouse Node.js client (@clickhouse/client). Use this skill whenever a user reports errors, unexpected behavior, or configuration questions involving the Node.js client specifically — including socket hang-up errors, Keep-Alive problems, stream handling issues, data type mismatches, read-only user restrictions, proxy/TLS setup problems, or long-running query timeouts. 
Trigger even when the user hasn't precisely named the issue; vague symptoms like \"my inserts keep failing\" or \"connection drops randomly\" in a Node.js context are strong signals to use this skill. Do NOT use for browser/Web client issues.\n","clickhouse-js-node-troubleshooting",{"claudeCode":12},"SKILL.md frontmatter at skills/clickhouse-client-js/clickhouse-js-node-troubleshooting/SKILL.md",[419],{"path":354,"priority":317},{"basePath":421,"description":422,"displayName":423,"installMethods":424,"rationale":425,"selectedPaths":426,"source":323,"sourceLanguage":18,"type":245},"skills/clickhousectl-cloud-deploy","Use when a user wants to deploy ClickHouse to the cloud, go to production, use ClickHouse Cloud, host a managed ClickHouse service, or migrate from a local ClickHouse setup to ClickHouse Cloud.","clickhousectl-cloud-deploy",{"claudeCode":12},"SKILL.md frontmatter at skills/clickhousectl-cloud-deploy/SKILL.md",[427],{"path":354,"priority":317},{"basePath":429,"description":430,"displayName":431,"installMethods":432,"rationale":433,"selectedPaths":434,"source":323,"sourceLanguage":18,"type":245},"skills/clickhousectl-local-dev","Use when a user wants to build an application with ClickHouse, set up a local ClickHouse development environment, install ClickHouse, create a local server, create tables, or start developing with ClickHouse. 
Covers the full flow from zero to a working local ClickHouse setup.","clickhousectl-local-dev",{"claudeCode":12},"SKILL.md frontmatter at skills/clickhousectl-local-dev/SKILL.md",[435],{"path":354,"priority":317},{"sources":437},[438],"manual",{"closedIssues90d":8,"description":440,"forks":231,"homepage":441,"license":238,"openIssues90d":234,"pushedAt":235,"readmeSize":229,"stars":236,"topics":442},"The official Agent Skills for ClickHouse and ClickHouse Cloud","https://clickhouse.ai",[443,211],"agents",{"classifiedAt":445,"discoverAt":446,"extractAt":447,"githubAt":447,"updatedAt":445},1778683910082,1778683905800,1778683908184,[211,209,215,213,214,210,212],{"evaluatedAt":241,"extractAt":283,"updatedAt":241},[],[452,481,509,540,561,590],{"_creationTime":453,"_id":454,"community":455,"display":456,"identity":462,"providers":467,"relations":475,"tags":477,"workflow":478},1778691799740.4863,"k17de5wxjp7msakczxjbt8e7sh86n1c7",{"reviewCount":8},{"description":457,"installMethods":458,"name":460,"sourceUrl":461},"Fast in-memory DataFrame library for datasets that fit in RAM. Use when pandas is too slow but data still fits in memory. Lazy evaluation, parallel execution, Apache Arrow backend. Best for 1-100GB datasets, ETL pipelines, faster pandas replacement. 
For larger-than-RAM data use dask or vaex.",{"claudeCode":459},"K-Dense-AI/claude-scientific-skills","Polars","https://github.com/K-Dense-AI/claude-scientific-skills",{"basePath":463,"githubOwner":464,"githubRepo":465,"locale":18,"slug":466,"type":245},"scientific-skills/polars","K-Dense-AI","claude-scientific-skills","polars",{"evaluate":468,"extract":473},{"promptVersionExtension":202,"promptVersionScoring":203,"score":469,"tags":470,"targetMarket":216,"tier":217},99,[213,471,472,289,214],"data-processing","performance",{"commitSha":274,"license":474},"MIT",{"repoId":476},"kd79rphh5gexy91xmpxc05h5mh86mm9r",[471,213,214,472,289],{"evaluatedAt":479,"extractAt":480,"updatedAt":479},1778693624979,1778691799740,{"_creationTime":482,"_id":483,"community":484,"display":485,"identity":491,"providers":496,"relations":503,"tags":505,"workflow":506},1778675145461.8557,"k1782c1rcbne9pgyx4be8ww93x86nbs2",{"reviewCount":8},{"description":486,"installMethods":487,"name":489,"sourceUrl":490},"Part of the AlterLab Academic Skills suite. Fast in-memory DataFrame library for datasets that fit in RAM. Use when pandas is too slow but data still fits in memory. Lazy evaluation, parallel execution, Apache Arrow backend. Best for 1-100GB datasets, ETL pipelines, faster pandas replacement. 
For larger-than-RAM data use dask or vaex.",{"claudeCode":488},"AlterLab-IEU/AlterLab-Academic-Skills","AlterLab Polars","https://github.com/AlterLab-IEU/AlterLab-Academic-Skills",{"basePath":492,"githubOwner":493,"githubRepo":494,"locale":18,"slug":495,"type":245},"skills/data-science/alterlab-polars","AlterLab-IEU","AlterLab-Academic-Skills","alterlab-polars",{"evaluate":497,"extract":502},{"promptVersionExtension":202,"promptVersionScoring":203,"score":498,"tags":499,"targetMarket":216,"tier":501},78,[500,213,291,214,289,472],"data-science","community",{"commitSha":274,"license":474},{"repoId":504},"kd7fqvj70pvyn4r3q9kctpnd7d86mfqd",[291,500,213,214,472,289],{"evaluatedAt":507,"extractAt":508,"updatedAt":507},1778676483564,1778675145461,{"_creationTime":510,"_id":511,"community":512,"display":513,"identity":519,"providers":523,"relations":533,"tags":536,"workflow":537},1778695548458.4036,"k171cqe6hd4yd3ktqnf3qy9z5186mmff",{"reviewCount":8},{"description":514,"installMethods":515,"name":517,"sourceUrl":518},"Design and execute insect population surveys covering survey design, sampling methods, field execution, specimen identification, diversity index calculation including Shannon-Wiener and Simpson indices, statistical analysis, and reporting. Covers defining survey objectives, selecting study sites, determining sampling intensity and replication, choosing sampling methods appropriate to target taxa, standardizing collection effort, recording environmental covariates, identifying specimens to the lowest practical taxonomic level, calculating species richness, Shannon-Wiener diversity (H'), Simpson diversity (1-D), evenness, rarefaction curves, multivariate ordination, and producing survey reports with species lists and conservation implications. 
Use when conducting baseline biodiversity assessments, monitoring insect populations over time, comparing insect communities across habitats or treatments, assessing environmental impact, or supporting conservation planning with quantitative ecological data.\n",{"claudeCode":516},"pjt222/agent-almanac","survey-insect-population","https://github.com/pjt222/agent-almanac",{"basePath":520,"githubOwner":521,"githubRepo":522,"locale":18,"slug":517,"type":245},"skills/survey-insect-population","pjt222","agent-almanac",{"evaluate":524,"extract":532},{"promptVersionExtension":202,"promptVersionScoring":203,"score":525,"tags":526,"targetMarket":216,"tier":217},100,[527,528,529,530,531,209],"entomology","insects","ecology","biodiversity","survey",{"commitSha":274},{"parentExtensionId":534,"repoId":535},"k170h0janaa9kwn7cfgfz2ykss86mmh9","kd7aryv63z61j39n2td1aeqkvh86mh12",[530,209,529,527,528,531],{"evaluatedAt":538,"extractAt":539,"updatedAt":538},1778701822946,1778695548458,{"_creationTime":541,"_id":542,"community":543,"display":544,"identity":548,"providers":550,"relations":557,"tags":558,"workflow":559},1778695548458.3613,"k17dx6tyy2yb3z5pp1vgmg46ad86nm18",{"reviewCount":8},{"description":545,"installMethods":546,"name":547,"sourceUrl":518},"Fit cognitive drift-diffusion models (Ratcliff DDM) to reaction time and accuracy data with parameter estimation (drift rate, boundary separation, non-decision time), model comparison, and parameter recovery validation. 
Use when modeling binary decision-making with reaction time data, estimating cognitive parameters from experimental data, comparing sequential sampling model variants, or decomposing speed-accuracy tradeoff effects into latent cognitive components.\n",{"claudeCode":516},"fit-drift-diffusion-model",{"basePath":549,"githubOwner":521,"githubRepo":522,"locale":18,"slug":547,"type":245},"skills/fit-drift-diffusion-model",{"evaluate":551,"extract":556},{"promptVersionExtension":202,"promptVersionScoring":203,"score":525,"tags":552,"targetMarket":216,"tier":217},[553,554,555,289,209],"cognitive-science","modeling","statistics",{"commitSha":274},{"parentExtensionId":534,"repoId":535},[553,209,554,289,555],{"evaluatedAt":560,"extractAt":539,"updatedAt":560},1778698191612,{"_creationTime":562,"_id":563,"community":564,"display":565,"identity":571,"providers":575,"relations":583,"tags":586,"workflow":587},1778695720086.7703,"k176r34g5a5fjn1z1a4gq6v88186nje0",{"reviewCount":8},{"description":566,"installMethods":567,"name":569,"sourceUrl":570},"Designs an A/B test or experiment with clear hypothesis, variants, success metrics, sample size, and duration. 
Use when planning experiments to validate product changes or test hypotheses.",{"claudeCode":568},"product-on-purpose/pm-skills","measure-experiment-design","https://github.com/product-on-purpose/pm-skills",{"basePath":572,"githubOwner":573,"githubRepo":574,"locale":18,"slug":569,"type":245},"skills/measure-experiment-design","product-on-purpose","pm-skills",{"evaluate":576,"extract":582},{"promptVersionExtension":202,"promptVersionScoring":203,"score":525,"tags":577,"targetMarket":216,"tier":217},[578,579,580,581,209],"ab-testing","experimentation","product-management","a-b-testing",{"commitSha":274},{"parentExtensionId":584,"repoId":585},"k1721116hsfj7zg78w03432n8986n6y8","kd78ksv1wjj826ds5j1sh2kqnx86mhqf",[581,578,209,579,580],{"evaluatedAt":588,"extractAt":589,"updatedAt":588},1778696438706,1778695720086,{"_creationTime":591,"_id":592,"community":593,"display":594,"identity":598,"providers":601,"relations":609,"tags":610,"workflow":611},1778691799740.488,"k1707r3f2j67714pvq6wk0r6y186m2zd",{"reviewCount":8},{"description":595,"installMethods":596,"name":597,"sourceUrl":461},"Differential gene expression analysis (Python DESeq2). Identify DE genes from bulk RNA-seq counts, Wald tests, FDR correction, volcano/MA plots, for RNA-seq analysis.",{"claudeCode":459},"PyDESeq2",{"basePath":599,"githubOwner":464,"githubRepo":465,"locale":18,"slug":600,"type":245},"scientific-skills/pydeseq2","pydeseq2",{"evaluate":602,"extract":608},{"promptVersionExtension":202,"promptVersionScoring":203,"score":525,"tags":603,"targetMarket":216,"tier":217},[604,605,606,607,289,209],"bioinformatics","genomics","rna-seq","deseq2",{"commitSha":274,"license":474},{"repoId":476},[604,209,607,605,289,606],{"evaluatedAt":612,"extractAt":480,"updatedAt":612},1778693766611]