Agents on the Hub

Hugging Face provides tools and protocols that connect AI agents directly to the Hub. Whether you’re chatting with Claude, building with Codex, or developing custom agents, you can access models, datasets, Spaces, and community tools. This page covers connecting your chat agents and coding agents to the Hub.

To build with agents on the Hub, check out the pages on MCP Server, Skills, CLI, and SDK.

Chat with Hugging Face

Connect your AI assistant directly to the Hugging Face Hub using the Model Context Protocol (MCP). Once connected, you can search models, explore datasets, generate images, and use community tools—all from within your chat interface.

Supported Assistants

The HF MCP Server works with any MCP-compatible client:

  • ChatGPT (via plugins)
  • Claude Desktop
  • Custom MCP clients

Setup

1. Open MCP Settings

MCP Settings Example

Visit huggingface.co/settings/mcp while logged in.

2. Select Your Client

Choose your MCP-compatible client from the list. The page shows client-specific instructions and a ready-to-copy configuration snippet.

3. Configure and Restart

Copy the configuration snippet into your client’s MCP settings, save, and restart your client.

The settings page generates the exact configuration your client expects. Use it rather than writing config by hand.
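For orientation, the snippet from step 3 is usually a small JSON block naming the remote server. The exact shape varies by client (key names and header handling differ), so treat this as a sketch and copy the generated version from the settings page:

```json
{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}
```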

What You Can Do

Once connected, ask your assistant to use any of the Hugging Face tools you enabled in your configuration:

Task             | Example Prompt
Search models    | “Find Qwen 3 quantizations on Hugging Face”
Explore datasets | “Show datasets about weather time-series”
Find Spaces      | “Find a Space that can transcribe audio files”
Generate images  | “Create a 1024x1024 image of a cat in Ghibli style”
Search papers    | “Find recent papers on vision-language models”

Your assistant calls MCP tools exposed by the Hugging Face server and returns results with metadata, links, and context.
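On the wire, each of those calls is an MCP `tools/call` request: a JSON-RPC 2.0 message naming a tool and its arguments. A minimal stdlib sketch of what a client sends (the `model_search` tool name and its arguments here are illustrative assumptions, not the server's exact schema):

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0) as a client would send it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and arguments, for illustration only.
print(make_tool_call(1, "model_search", {"query": "Qwen 3 quantizations", "limit": 5}))
```

Your MCP client builds and sends these messages for you; the point is only that every prompt above bottoms out in a structured tool call the server can dispatch.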

Add Community Tools

Extend your setup with MCP-compatible Gradio Spaces:

  1. Browse Spaces with MCP support
  2. Add them in your MCP settings
  3. Restart your client to pick up new tools

Gradio MCP apps expose their functions as tools with arguments and descriptions, so your assistant can call them directly.
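Concretely, each exposed function is advertised to the client as a tool descriptor: a name, a description, and a JSON Schema for its arguments. A stdlib-only sketch of such an entry and the kind of argument check a client can run before calling it (the `transcribe_audio` tool is a made-up example, not a real Space):

```python
# Hypothetical descriptor, shaped like an MCP tool listing entry.
tool = {
    "name": "transcribe_audio",
    "description": "Transcribe an audio file to text.",
    "inputSchema": {
        "type": "object",
        "properties": {"audio_url": {"type": "string"}},
        "required": ["audio_url"],
    },
}

def missing_args(tool: dict, arguments: dict) -> list:
    """Return the required arguments the caller failed to supply."""
    required = tool["inputSchema"].get("required", [])
    return [name for name in required if name not in arguments]

print(missing_args(tool, {}))                                          # ['audio_url']
print(missing_args(tool, {"audio_url": "https://example.com/a.wav"}))  # []
```

Because the schema travels with the tool, your assistant can fill in arguments correctly without any extra glue code on your side.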

Coding Agents

Integrate Hugging Face into your coding workflow with the MCP Server and Skills. Access models, datasets, and ML tools directly from your IDE or coding agent. For example, the following coding agents (among others) are supported via MCP and/or Skills:

Coding Agent | Integration Method
Claude Code  | MCP Server + Skills
OpenAI Codex | MCP Server + Skills
OpenCode     | MCP Server + Skills
Cursor       | MCP Server
VS Code      | MCP Server
Gemini CLI   | MCP Server
Zed          | MCP Server

Quick Setup

MCP Server

The MCP Server gives your coding agent access to Hub search, Spaces, and community tools.

Cursor / VS Code / Zed:

  1. Visit huggingface.co/settings/mcp
  2. Select your IDE from the list
  3. Copy the configuration snippet
  4. Add it to your IDE’s MCP settings
  5. Restart the IDE

Claude Code:

claude mcp add hf-mcp-server -t http "https://huggingface.co/mcp?login"

Skills

Skills provide task-specific guidance for AI/ML workflows. They work alongside MCP or standalone.

# start claude 
claude

# install the skills marketplace plugin
/plugin marketplace add huggingface/skills

Then install a specific Skill:

/plugin install hugging-face-cli@huggingface/skills

See the Skills Guide for available skills and usage.
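Under the hood, a skill is typically a folder containing a SKILL.md whose frontmatter tells the agent when to load it. A hedged sketch following Anthropic's skill format (this example skill and its contents are invented):

```markdown
---
name: hub-model-search
description: Guidance for searching and comparing models on the Hugging Face Hub.
---

# Hub model search

When the user asks for a model, query the Hub first and compare
download counts, license, and parameter size before recommending one.
```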

What You Can Do

Once configured, your coding agent can:

Capability        | Example
Search the Hub    | “Find a code generation model under 7B parameters”
Generate images   | “Create a diagram of a transformer architecture”
Explore datasets  | “What datasets are available for sentiment analysis?”
Run Spaces        | “Use the Whisper Space to transcribe this audio file”
Get documentation | “How do I fine-tune a model with transformers?”

Environment Configuration

Authentication

Set your Hugging Face token as an environment variable:

export HF_TOKEN="hf_..."

Or authenticate via the CLI:

hf auth login
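Clients look for `HF_TOKEN` in the environment (falling back to the token saved by `hf auth login`) and send it as a bearer header on Hub requests. A stdlib-only sketch of the header construction, purely to show where the token ends up; real clients handle this for you:

```python
import os

def auth_header() -> dict:
    """Build the Authorization header Hub requests use, from HF_TOKEN if set."""
    token = os.environ.get("HF_TOKEN")
    return {"Authorization": f"Bearer {token}"} if token else {}

os.environ["HF_TOKEN"] = "hf_example"  # simulates `export HF_TOKEN="hf_..."`
print(auth_header())
```

If no token is present, requests go out unauthenticated and gated or private repositories will be inaccessible.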

Adding Community Tools

Extend your setup with MCP-compatible Gradio Spaces:

  1. Browse Spaces with MCP support
  2. Add them in your MCP settings
  3. Restart your IDE

Example Workflow

You: Find a text classification model that works well on short texts

Agent: [Searches Hugging Face Hub]
       Found several options:
       - distilbert-base-uncased-finetuned-sst-2-english (sentiment)
       - facebook/bart-large-mnli (zero-shot)
       ...

You: Show me how to use the first one

Agent: [Fetches documentation]
       Here's how to use it with transformers:
       
       from transformers import pipeline
       classifier = pipeline("sentiment-analysis", 
                            model="distilbert-base-uncased-finetuned-sst-2-english")
       result = classifier("I love this product!")

Next Steps

  • MCP Server - Connect any MCP-compatible AI assistant to the Hub
  • Skills - Pre-built capabilities for coding agents
  • CLI - Command-line interface for Hub operations
  • SDK - Python and JavaScript libraries for building agents