Crawling
Definition
Crawling is the process by which a search engine bot fetches pages and supporting resources (HTML, images, CSS, JavaScript) from a site.
Why It Exists
Search engines need to discover new pages and keep their copy of already-known content fresh.
How It Fits Into The System
Crawling precedes indexing in the search pipeline. Sitemaps and links help bots discover URLs, while robots rules, redirects, and server responses (status codes and headers) determine what gets fetched and how often.
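A minimal sketch of how robots rules gate fetching, using Python's standard-library `urllib.robotparser`. The robots.txt lines and URLs below are hypothetical, chosen only to illustrate an allow/disallow decision:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; parse() accepts the lines directly,
# so no network fetch is needed for this sketch.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# A well-behaved bot consults these rules before requesting a URL.
print(parser.can_fetch("Googlebot", "https://example.com/posts/hello"))
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))
```

The first check returns True (the path is allowed) and the second False (it falls under the Disallow rule), which is exactly the decision a crawler makes before each fetch.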
Important Details
Crawling is not the same as indexing. A bot can fetch a URL and still decide not to index it, for example when the page carries a noindex directive or duplicates content the engine already has.
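One concrete way a fetched page can opt out of the index is a robots meta tag. A sketch using Python's standard-library `html.parser`; the HTML snippet is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots"> tags in a fetched page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip() for d in a.get("content", "").split(",")]

# Hypothetical page: the bot successfully crawled it,
# yet the page asks not to be indexed.
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
p = RobotsMetaParser()
p.feed(html)
print("noindex" in p.directives)
```

Here the fetch succeeds, but the noindex directive tells the engine to keep the page out of its index — crawled, not indexed.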
Example
Googlebot requests /posts/build100-manifest to inspect the page's content and metadata.
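The discovery loop behind an example like this can be sketched as a breadth-first walk over a link graph: fetch a URL, queue the links it contains, and never refetch a URL already seen. The link graph below is an in-memory stand-in for a real site; all paths except /posts/build100-manifest are hypothetical:

```python
from collections import deque

# In-memory link graph standing in for a real site's pages and their links.
links = {
    "/": ["/posts/", "/about"],
    "/posts/": ["/posts/build100-manifest"],
    "/about": [],
    "/posts/build100-manifest": ["/"],
}

def crawl(start):
    """Breadth-first discovery: 'fetch' a URL, then queue the links it contains."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)           # stand-in for fetching the page
        for nxt in links.get(url, []):
            if nxt not in seen:     # avoid refetching known URLs
                seen.add(nxt)
                queue.append(nxt)
    return order

print(crawl("/"))
```

Starting from the homepage, the crawler reaches /posts/build100-manifest only because another crawled page links to it — which is why internal linking affects crawling.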