Sitemap
An XML file that lists all important pages on your website to help search engines discover and crawl them.
Understanding Sitemap
An XML sitemap is a structured file that lists the URLs on your website along with optional metadata such as the last modification date, change frequency, and priority. It helps search engines discover pages that might otherwise be hard to find through normal crawling, especially on large or new sites. Dynamic sitemaps, which update automatically when content is added or changed, are recommended over static files that can go stale. Submit your sitemap through Google Search Console and reference it in your robots.txt file so crawlers can find it. For large sites, a sitemap index can organize multiple sitemaps by section (blog, products, categories); the sitemaps.org protocol also caps each sitemap file at 50,000 URLs or 50 MB uncompressed, so big sites must split their sitemaps anyway.
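As a sketch, a minimal sitemap following the sitemaps.org protocol might look like this (the URLs and dates are placeholders; only the loc element is required for each entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: full metadata shown for illustration -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A content page: loc alone is valid; lastmod is the most useful optional field -->
  <url>
    <loc>https://www.example.com/blog/first-post</loc>
    <lastmod>2024-04-12</lastmod>
  </url>
</urlset>
```

To reference it from robots.txt, a single line at any point in the file is enough: `Sitemap: https://www.example.com/sitemap.xml`. A sitemap index uses the same pattern with a `sitemapindex` root element whose `sitemap`/`loc` children point at the individual sitemap files.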
Keep learning
Crawl Budget
The number of pages a search engine will crawl on your site within a given timeframe.
Robots.txt
A text file that instructs search engine crawlers which pages they can or cannot access.
Indexing
The process by which search engines store and organize web pages in their database for retrieval in search results.