Overview
Use the Crawl MCP server from your ATXP-powered agent to search and extract information from the web. The Crawl MCP server can be used to:
- crawl up to a specified maximum number of pages
- extract information from websites
Example prompts
- “Scrape https://docs.atxp.ai and give me the text from the page.”
- “Crawl https://www.baseball-reference.com/teams/NYM/ and give me the details on the Mets.”
Cloudflare pay per crawl
The Crawl MCP server is compatible with Cloudflare’s pay per crawl scheme. If you instruct the service to crawl a website with pay per crawl enabled, the cost of the tool call will include the added fee imposed by the content provider.
Tools
crawl_scrape
Scrape a single web page and return its text content. It is useful when you only need the text of one page from a website.
Arguments
Accepts a JSON object with the following properties:
- The URL of the website to scrape.
Response
A JSON object with the following properties:
- The status of the scrape operation. The status key will have the value “success” when the scrape is complete and HTML content was found. If the scrape fails to find any HTML content, the status key will have a value of “error”.
- The HTML content scraped from the specified URL.
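A minimal sketch of how an agent might guard on the status key before using the scraped HTML. The property names in the type below are assumptions based on the fields described above, not the server’s exact schema.

```typescript
// Hypothetical shape of a crawl_scrape response; the property names
// are assumptions based on the fields described in this reference.
type ScrapeResponse = {
  status: 'success' | 'error';
  html?: string; // HTML content scraped from the URL, when present
};

// Check the status key before using the scraped HTML.
function readScrape(response: ScrapeResponse): string {
  if (response.status !== 'success') {
    throw new Error('Scrape failed: no HTML content was found');
  }
  return response.html ?? '';
}
```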
crawl_crawl
Crawl a website and return the text content from multiple pages. It is useful when you need text from an entire site rather than a single page.
Arguments
Accepts a JSON object with the following properties:
- The URL of the website to crawl.
- The maximum number of pages to crawl. The default value is 10.
Response
Returns a JSON object with the following properties:
- The status of the crawl operation. The status key will have the value “success” when the crawl is complete.
- The text content crawled from the specified URL.
- The ID of the crawl task.
- The estimated time in seconds until the crawl is complete.
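Because a crawl may still be running when the response comes back, an agent can branch on the status key: use the text when the crawl is complete, otherwise report the task ID and estimated wait. The property names below are assumptions based on the fields listed above.

```typescript
// Hypothetical shape of a crawl_crawl response; property names are
// assumptions based on the fields described in this reference.
type CrawlResponse = {
  status: string;      // "success" when the crawl is complete
  text?: string;       // text content crawled from the URL
  taskId?: string;     // ID of the crawl task
  etaSeconds?: number; // estimated seconds until completion
};

// Return the crawled text, or a note on how long the crawl may take.
function summarizeCrawl(r: CrawlResponse): string {
  if (r.status === 'success') {
    return r.text ?? '';
  }
  return `Crawl ${r.taskId ?? 'unknown'} still running, about ${r.etaSeconds ?? 0}s remaining`;
}
```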
Usage
1
Define the Crawl service
Create a reusable service configuration that points to the MCP server and standardizes how you pass arguments and read results. This lets your agent easily interact with the Crawl tools in a consistent manner.
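A sketch of what such a service configuration might look like. The tool name and the default limit of 10 come from the tool reference above; the server URL, argument names, and result shape are assumptions to check against the ATXP docs.

```typescript
// Reusable Crawl service configuration (sketch). The mcpServer URL,
// argument names, and result shape are assumptions.
const crawlService = {
  mcpServer: 'https://crawl.mcp.atxp.ai', // assumed URL; verify in the ATXP docs
  toolName: 'crawl_crawl',
  description: 'Crawl a website and return its text content',
  // Build the tool arguments; 10 is the documented default page limit.
  getArguments: (url: string, limit = 10) => ({ url, limit }),
  // Pull the text out of the tool result.
  getResult: (result: { content: { text: string }[] }) =>
    result.content[0]?.text ?? '',
};
```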
2
Create an ATXP client
- Using an ATXP account
- Using a Base account
- Using a Solana account
- Using a Worldchain account
- Using a Polygon account
Create a client using an ATXP account by importing the ATXP client SDK and other dependencies.
3
Use the Crawl service in your agent
Call the Crawl tool by passing your natural-language instruction as the argument to the getArguments method. Read the response using the getResult method. You should see the content of the crawled pages printed in your console.
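The steps above can be sketched end to end. To keep the example runnable without network access or an ATXP account, a stub stands in for the real client returned by the ATXP SDK; the service shape mirrors step 1, and all names here are illustrative assumptions.

```typescript
type ToolResult = { content: { text: string }[] };

// Service definition from step 1 (sketch; names are assumptions).
const crawlService = {
  toolName: 'crawl_crawl',
  getArguments: (url: string, limit = 10) => ({ url, limit }),
  getResult: (result: ToolResult) => result.content[0]?.text ?? '',
};

// Stub standing in for the MCP client a real atxpClient call would return.
const client = {
  async callTool(req: { name: string; arguments: { url: string; limit?: number } }): Promise<ToolResult> {
    return { content: [{ text: `crawled ${req.arguments.url}` }] };
  },
};

async function run() {
  const result = await client.callTool({
    name: crawlService.toolName,
    arguments: crawlService.getArguments('https://docs.atxp.ai'),
  });
  // Prints the crawled text content to the console.
  console.log(crawlService.getResult(result));
}

run();
```

With a real ATXP client in place of the stub, the same getArguments/getResult flow applies; only the client construction changes.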