Running Site Audit

How to set up the Site Audit tool.

Written by Maja Nagelj

Why Site Audit?

Nightwatch can crawl your website automatically on a regular basis and provide an exhaustive analysis, so you can make targeted improvements to your website and its individual pages. You can also filter the results by specific conditions to analyze different aspects of your website more easily.

This feature shows you the condition of your website: for example, which pages return specific response codes, which pages contain broken links, how long different pages take to load, and more.
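To make that concrete, the checks below mirror the kind of per-page data an audit records, such as the response code and the load time. This is a minimal sketch in Python; the function and field names are illustrative and not part of Nightwatch:

```python
import time
import requests

def check_page(url: str) -> dict:
    """Fetch a page and record the kind of data a site audit surfaces."""
    start = time.monotonic()
    response = requests.get(url, timeout=10)
    return {
        "url": url,
        "status_code": response.status_code,                  # e.g. 200, 404, 500
        "load_time_seconds": round(time.monotonic() - start, 2),
    }

print(check_page("https://example.com"))
```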

Start running Site Audit

Here's how to run the Site Audit feature and crawl your website:

  • Click on Site Audit in the sidebar on the left.

  • After that, you will be taken to the Site Audit settings page, where you can customize your settings (see below).

  • If you would like your website to be crawled automatically on a regular schedule, toggle on auto-crawling.

  • Click 'Start Crawling.'


Customizing your Site Audit settings

[Screenshot: the Site Audit settings page]

To exclude specific link paths from the crawl, insert them in the "Ignore link paths" section shown in the image above. You can:

  • Insert specific full links to ignore only those exact pages,

  • Or add partial links, which act like wildcards (see the sketch after this list for how the matching behaves):

    • Inserting example.com/2018 will also ignore all of its sub-paths.
      For example, these pages will be ignored:

      • example.com/2018/blog

      • example.com/2018/categories

    • Inserting an incomplete link such as example.com/wp- will ignore every link that contains that string. For example, these pages will be ignored:

      • example.com/wp-admin

      • example.com/wp-cli

      • example.com/wp-content
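Conceptually, the matching behaves like substring filtering: a URL is skipped when it contains any of the ignore patterns. Here is a minimal sketch of that rule in Python, illustrating the behavior described above rather than Nightwatch's actual implementation:

```python
# Patterns taken from the examples above; partial links act like wildcards.
IGNORE_PATTERNS = ["example.com/2018", "example.com/wp-"]

def is_ignored(url: str) -> bool:
    """A URL is skipped if it contains any ignore pattern as a substring."""
    return any(pattern in url for pattern in IGNORE_PATTERNS)

assert is_ignored("https://example.com/2018/blog")     # sub-path is ignored
assert is_ignored("https://example.com/wp-admin")      # partial match is ignored
assert not is_ignored("https://example.com/about")     # unrelated page is crawled
```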

Crawling concurrency level: if you're seeing failed pages, try reducing this value. Crawling will be slower, but it puts less strain on the server where your website is hosted.
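The concurrency level caps how many pages the crawler fetches at the same time. A minimal sketch of the idea using Python's asyncio and aiohttp (illustrative only; Nightwatch's crawler will differ in detail):

```python
import asyncio
import aiohttp

CONCURRENCY_LEVEL = 3  # lower this if pages start to fail

async def fetch(session: aiohttp.ClientSession,
                semaphore: asyncio.Semaphore, url: str):
    # The semaphore allows at most CONCURRENCY_LEVEL requests in flight,
    # which limits the strain on the server hosting the site.
    async with semaphore:
        async with session.get(url) as response:
            return url, response.status

async def crawl(urls):
    semaphore = asyncio.Semaphore(CONCURRENCY_LEVEL)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, semaphore, u) for u in urls))

print(asyncio.run(crawl(["https://example.com/", "https://example.com/about"])))
```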


Check out our tutorial video on YouTube.
