Using Simplescraper's MCP Server

Model Context Protocol (MCP) is a standardized way for AI applications like Claude, Cursor and ChatGPT to interact with web apps like Simplescraper.

By connecting to Simplescraper's MCP server, you can automatically scrape websites, manage recipes, and analyze scraped data from any MCP-enabled AI application or agent.

Simplescraper's MCP server URL:

https://mcp.simplescraper.io/mcp



Quick Setup


Claude Desktop

  1. Open Claude Desktop settings
  2. Navigate to Settings (bottom-left) → Integrations
  3. Click "Add MCP Server"
  4. Enter these details:
    • Name: Simplescraper
    • URL: https://mcp.simplescraper.io/mcp
  5. Complete the authentication flow to connect your Simplescraper account
  6. Start using natural language to control your scrapes!

Claude Integrations availability

Integrations are currently available on the Max plan, with broader availability coming soon.


Cursor IDE

Cursor supports MCP with OAuth authentication. When you add the server, Cursor will open a browser window for you to log in with your Simplescraper account.

Global Setup (All Projects):

  1. Create ~/.cursor/mcp.json in your home directory
  2. Add this configuration:
```json
{
  "mcpServers": {
    "simplescraper": {
      "url": "https://mcp.simplescraper.io/mcp"
    }
  }
}
```
  3. When you toggle the server on in Cursor, it will prompt you to authenticate with your Simplescraper account

Project-Specific Setup:

For tools specific to a project, create a .cursor/mcp.json file in your project directory:

  1. Create .cursor/mcp.json in your project root
  2. Use the same JSON configuration as above
  3. The server will only be available in that project

Via Cursor Settings UI:

  1. Open Cursor Settings (Ctrl/Cmd + ,)
  2. Navigate to MCP in the sidebar
  3. Click "Add Server"
  4. Enter:
    • Name: Simplescraper
    • URL: https://mcp.simplescraper.io/mcp
  5. When prompted, log in with your Simplescraper account

Do not add an Authorization header manually

Cursor handles authentication automatically through OAuth. Adding "headers": {"Authorization": "Bearer ..."} with your API key will prevent the login flow from working. Use the config above with just the URL.
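To make the warning concrete, this is a sketch of the misconfiguration to avoid — a `headers` key with a bearer token alongside the URL (the `YOUR_API_KEY` placeholder is illustrative):

```json
{
  "mcpServers": {
    "simplescraper": {
      "url": "https://mcp.simplescraper.io/mcp",
      "headers": { "Authorization": "Bearer YOUR_API_KEY" }
    }
  }
}
```

If your config looks like this, delete the `headers` entry entirely so Cursor can run its OAuth login flow.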



Other AI Applications

Most AI applications that support MCP will have a similar setup process. Look for MCP or "Tools" settings and add our server URL: https://mcp.simplescraper.io/mcp
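For clients that use a JSON configuration file rather than a settings UI, the entry typically follows the same shape as the Cursor example — a sketch (the exact file location and top-level key, e.g. `mcpServers`, vary by application, so check your client's documentation):

```json
{
  "mcpServers": {
    "simplescraper": {
      "url": "https://mcp.simplescraper.io/mcp"
    }
  }
}
```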



Available Capabilities

The Simplescraper MCP server provides these tools to AI applications:


Recipe Management

  • list_recipes - Find recipes by domain, keyword, or recent activity
  • get_recipe - View full details of a specific recipe including selectors
  • create_recipe - Build new scrape recipes with CSS selectors
  • update_recipe - Modify existing recipes (name, URL, or selectors)

Data Extraction

  • run_recipe - Execute a scrape to get fresh data
  • smart_extract - Use AI to extract data without CSS selectors
  • get_latest_results - Retrieve the most recent scraped data
  • get_results - Fetch results by specific result ID
  • get_results_history - View past scrape runs for a recipe

Batch Operations

  • update_batch_urls - Manage lists of URLs for batch scraping
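Under the hood, MCP clients invoke these tools over JSON-RPC 2.0 using a `tools/call` request. As an illustration, a call to `smart_extract` might look like the following — your AI application builds these messages for you, and the `url` and `prompt` argument names here are assumptions for the sketch, not the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "smart_extract",
    "arguments": {
      "url": "https://example.com/products",
      "prompt": "Extract product names, prices, and ratings"
    }
  }
}
```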

Example Prompts

Here are some ways to interact with Simplescraper through Claude:


Smart Extract a Website

"Extract product names, prices, and ratings from https://example.com/products"

Claude will use the smart_extract tool to intelligently identify and extract the requested data.


View Recent Results

"Show me the latest results from my Twitter profile scraper"

Claude will find your recipe and display the most recent scraped data.


Update Selectors

"The price selector on my Amazon scraper needs updating to .new-price-class"

Claude will update the specific selector in your recipe.


Filter Recipes by Domain

"List all my recipes that scrape from linkedin.com"

Claude will show all recipes targeting LinkedIn.


Analyze Scraped Data

"Show me my liked tweets and organize them by topic"

Claude will retrieve your Twitter data and categorize it for you.


Manage Batch URLs

"Add these 10 product URLs to my product details scraper"

Claude will update the batch URL list for crawler-based scraping.


Advanced Usage Examples


Complex Workflows

"Check my Amazon price tracker recipe. If prices have changed by more than 10% 
since the last run, create a summary report of the changes."

Multi-Step Operations

"Find all recipes that haven't run in the last week, run them, 
and summarize any that return errors."

Data Transformation

"Get the latest results from my job listings scraper and format them 
as a markdown table sorted by salary."


Getting Help