Getting your pages indexed is the number one step to ranking them on search result pages. So when you get a notification that some pages failed the crawl test, it can leave you confused about where to start.
Here's how to fix it:
- Confirm domain ownership (easy peasy), either through a Google tag or your GA4 account.
- Check the domain endpoint. Google rarely indexes critical endpoints. An example of a critical endpoint: sureway.com/dashboard. These types of pages should be excluded from crawling with a Disallow rule in your robots.txt file.
- Ensure unindexed pages are internally linked from other pages on the primary domain.
- Drop irrelevant content. Only submit pages that are worth indexing.
- Run technical audits on affected pages to fix broken links and other issues like slow page speed. Load time should be under 5 seconds.
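Before trusting a Disallow rule, you can sanity-check it the way a crawler would. Here's a minimal sketch using Python's standard library, with a hypothetical robots.txt that blocks the /dashboard endpoint from the example above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block the critical /dashboard endpoint
# while leaving the rest of the site crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /dashboard
Allow: /
"""

def is_crawlable(path, robots_txt=ROBOTS_TXT, agent="Googlebot"):
    """Return True if the given path is allowed for the crawler."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, path)

print(is_crawlable("/dashboard"))  # False: excluded from crawling
print(is_crawlable("/pricing"))    # True: open to search engines
```

In production you'd point the parser at your live robots.txt URL instead of a string, but the matching logic is the same.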
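The internal-linking check can be scripted too. A rough sketch with Python's built-in HTML parser that collects links pointing back to the primary domain — the domain name and HTML snippet here are made up for illustration:

```python
from html.parser import HTMLParser

class InternalLinkFinder(HTMLParser):
    """Collect hrefs that point back to the primary domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Relative links and absolute links on the same domain count.
            if href.startswith("/") or self.domain in href:
                self.internal.append(href)

# Hypothetical snippet of a page on sureway.com
html = '<a href="/pricing">Pricing</a> <a href="https://other.com/x">Ext</a>'
finder = InternalLinkFinder("sureway.com")
finder.feed(html)
print(finder.internal)  # ['/pricing']
```

An unindexed page that never shows up in any page's `internal` list is a page Google has no path to discover.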
Bonus points: sometimes all you have to do is literally nothing and simply wait. Counterintuitive, yes, but it works.