One of the most important things any website owner needs to be able to do is crawl the site's URLs, mainly for SEO and web development purposes. Many on-page SEO factors can be found with a simple scan, so running a routine scan is optimal. Plenty of free ways to crawl and export website URLs exist.

For website indexing optimization and meta tag management, crawling a website is the only way to make sure every URL is accounted for. Crawling a website the way a search bot does can also surface SEO issues you hadn't considered before, which is all the more reason to give every website routine scans to keep its SEO health in check.

Screaming Frog Is Often An SEO's First Weapon Of Choice

This is in no way an endorsement, but Screaming Frog is one of the most useful SEO tools around! The free version will scan up to 500 URLs, which is really not bad. Everything from meta titles to H1 tags and even image alt text can be crawled and reported. Exports to .csv are a breeze, and if you're in a hurry, just copy and paste!

Find broken links, errors, and redirects, and even have an XML sitemap generated. Something really cool the free version offers is "site visualization," which displays an interactive chart of the crawl. Being interactive, this chart allows for a quick understanding of URL architecture. Even if you don't find it useful, it can certainly impress and interest.

Xenu's Link Sleuth Is A Great Free Alternative

Xenu's Link Sleuth is a long-standing tool used by SEOs and webmasters alike. Xenu is a Windows-based application built for webpage crawling. The main difference with Xenu's reports is that they're generated as web pages. There are no fancy exports or visuals, but the output is very organized. It's really useful when you need to scan more than 500 pages and are on a budget.

Each URL can carry some interesting details, such as links to its Google cache and Internet Archive copies, which is very useful for recovering content from 404 pages. If you're in a rush and only need a few pages, simply pressing "R" generates a report, even if the scan isn't finished.

Search Engine Advanced Search Operators Can Be Useful

Google Search itself can be a great way to crawl a website. Using something called "advanced search operators," a query can be restricted to deliver very specific results. For example, searching "site:domain.com" will bring up many of the website's pages in the SERPs. Just remember that the result count will not always reflect the actual number of pages on the website; for a number of reasons a site could have more or fewer, so relying on that number to be 100% accurate isn't recommended.
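To make this concrete, here are a few common operator combinations (domain.com and the keywords are placeholders, swap in your own):

```
site:domain.com                       restrict results to pages indexed under the domain
site:domain.com inurl:blog            indexed pages whose URL contains "blog"
site:domain.com intitle:"seo audit"   indexed pages with that phrase in the title
site:domain.com -inurl:tag            exclude URLs containing "tag"
```

Operators can be stacked, so a single query can isolate a very specific slice of a website's indexed pages.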

These results can then be exported via Google Sheets and something called web scraping. Using a specific formula, such as IMPORTXML, Google Sheets can be turned into a web scraper that displays the data you select. This is very useful for pulling meta titles, meta descriptions, and even URLs. Considering the endless possibilities of combining advanced operators with the power of a web scraper, many mundane SEO tasks can be automated.
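As a rough sketch of what that kind of scraping does under the hood, the Python snippet below pulls a page's meta title and meta description out of its HTML using only the standard library. The sample HTML is hypothetical; in practice the markup would come from fetching a live URL.

```python
# Minimal sketch: extract the <title> text and meta description from HTML,
# the same data points an IMPORTXML-style scrape would target.
from html.parser import HTMLParser


class MetaExtractor(HTMLParser):
    """Collects the page title and the content of <meta name="description">."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name") == "description":
                self.description = attr_map.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


# Hypothetical page markup standing in for a fetched URL.
sample_html = """
<html><head>
<title>Example Page</title>
<meta name="description" content="A short example description.">
</head><body><h1>Hello</h1></body></html>
"""

parser = MetaExtractor()
parser.feed(sample_html)
print(parser.title)        # Example Page
print(parser.description)  # A short example description.
```

Run the same extraction over each URL surfaced by a site: search and you have the beginnings of an automated meta audit.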

Does your website need a routine SEO audit?

Contact SEOByMichael to schedule an SEO audit consultation, and let’s develop a website success strategy today!