CF AI Gateway
MCP server for Cloudflare AI Gateway — manage AI request routing, caching, rate limiting, and observability across LLM providers.
Skills for AI model inference, RAG pipelines, image generation, and ML operations
MCP server for Cloudflare AutoRAG — build and query retrieval-augmented generation pipelines with Vectorize and Workers AI.
MCP server for Google Vertex AI Creative Studio — generate images, edit media, and create visual content using Vertex AI models.
MCP server for Amazon Bedrock Knowledge Bases — query RAG-powered knowledge bases with foundation model retrieval and generation.
MCP server for Amazon Nova Canvas — generate, edit, and manipulate images using the Nova foundation model on Bedrock.
Guide for Netlify AI Gateway — access AI models from OpenAI, Anthropic, and Google via a unified proxy without managing API keys directly.
Build Gradio web UIs and demos in Python — create apps, components, event listeners, layouts, and chatbots.
Skill for executing Hugging Face Hub operations such as downloading models, datasets, and Spaces, uploading files, creating repos, managing local cache, and running compute jobs.
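A rough sketch of the underlying `huggingface_hub` Python API these operations map onto (repo and file names below are placeholders, and the network calls are left commented out):

```python
from huggingface_hub import HfApi, hf_hub_download

api = HfApi()

# Download a single file from a public model repo (network call when run):
# config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")

# Create a repo and upload a file (requires an auth token):
# api.create_repo("your-username/my-model", exist_ok=True)
# api.upload_file(
#     path_or_fileobj="pytorch_model.bin",
#     path_in_repo="pytorch_model.bin",
#     repo_id="your-username/my-model",
# )
```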
Create and manage datasets on Hugging Face Hub — initialize repos, define configs, stream row updates, and run SQL-based queries.
Add and manage evaluation results in Hugging Face model cards — extract eval tables, import scores, and run custom evaluations.
Skill for running workloads on Hugging Face Jobs infrastructure, including UV scripts, Docker-based jobs, hardware selection, cost estimation, and secrets management.
Skill for training or fine-tuning language models with TRL on Hugging Face Jobs, including SFT, DPO, GRPO, reward modeling, GGUF conversion, and Trackio monitoring.
Publish and manage research papers on Hugging Face Hub — create paper pages, link to models and datasets, and claim authorship.
Skill for building reusable CLI scripts for Hugging Face API operations, chaining API calls, and automating repeated tasks.
Track and visualize ML training experiments with Trackio — log metrics via the Python API, fire training alerts, and retrieve logged metrics through real-time dashboards.
Official NVIDIA NeMo Agent Toolkit capability for publishing workflow functions as MCP tools through nat mcp serve and fastmcp server run.
Official Redis MCP server focused on durable agent memory workflows, distinct from the general Redis MCP Server and the Redis Cloud API MCP Server.
Official Pinecone developer MCP server for documentation search, index management, record upsert, and search workflows.
Official Pinecone Assistant MCP server for retrieving information from Pinecone Assistant knowledge bases.
Official Weights & Biases MCP server for querying runs, traces, reports, projects, and documentation through natural language.
Official Replicate docs-backed remote MCP server for model search, metadata retrieval, prediction execution, and other Replicate API workflows.