Site Audit FAQ


How can I create a project?

There are two main ways to create a project:

  1. From your Project List, which opens when you click the Sitechecker logo.
  2. From the dropdown menu that lists all your existing projects.

Click “Create a project” to add a new website to your account.

You can access project settings from almost any section of the Sitechecker platform. Just look for the “Settings” button in the top right corner of the screen in Dashboard, Site Audit, Site Monitoring, Rank Tracker, or Backlink Tracker.


How do I find where the link is located on other pages of my website?

After you choose the URL you want to inspect, click on it to open the Page Details report. There, go to the linking section to see all of the URL’s connections with other pages, both internal and external.

How do I find a specific page in the Site Audit?

Go to the Site Audit summary and click the “View all affected pages” button.

There you can filter all the pages by various conditions to locate the URLs you need.

My website requires a login or acceptance of cookie policies to access the content. Will Sitechecker work with my website?

Sitechecker can’t crawl websites that require manual input. Try disabling the login form or cookie requirement while the Site Audit runs and re-enabling it once the Audit is finished.

Please note that you will need to repeat this procedure for every future Site Audit check.

How is the Website Score calculated?

We use the following formula to calculate this metric:

Website Score = (sum of all OnePageScores) / number of pages

OnePageScore = 100 − cost of critical error one − cost of critical error two − cost of warning one, and so on

Cost of a specific critical error = (60 × number of that specific error) / total number of critical errors

Cost of a specific warning = (40 × number of that specific warning) / total number of warnings
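As a rough illustration, the formula above can be sketched in Python. The issue names, counts, and pages below are invented for the example, and reading “total number of critical errors” as a site-wide total is an assumption, not confirmed Sitechecker behavior:

```python
# Illustrative sketch of the Website Score formula described above.
# All issue names and counts here are made up for the example.

def issue_costs(counts, budget):
    """Split a fixed point budget (60 for critical errors, 40 for warnings)
    across issue types in proportion to how often each occurs."""
    total = sum(counts.values())
    return {name: budget * n / total for name, n in counts.items()} if total else {}

# Hypothetical site-wide issue counts.
critical = {"broken link": 2, "server error": 1}   # 3 critical errors in total
warnings = {"missing alt text": 3}                 # 3 warnings in total

costs = {**issue_costs(critical, 60), **issue_costs(warnings, 40)}

# Two hypothetical pages: one with every issue type, one clean.
pages = [["broken link", "server error", "missing alt text"], []]

scores = [100 - sum(costs[i] for i in page) for page in pages]
website_score = sum(scores) / len(pages)
print(scores, website_score)  # a page with every issue scores 0, a clean page scores 100
```

Under these assumptions, the clean page keeps the full 100 points, the page carrying every issue loses the whole 60-point and 40-point budgets, and the Website Score averages out to 50.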

You can learn more about how the Website Score is calculated via this link.

My audit has been loading for more than 10 minutes.

If you have an e-commerce website or a project with many pages, the audit may take longer depending on the number and complexity of the pages. Pay attention to the progress bar and the number of crawled pages. If this number does not change for more than 30 minutes, please let us know in our support chat or via email at support@sitechecker.pro, and we will be happy to help.

My Site Audit has only 1 page checked. 

Make sure that your website does not have a firewall that prevents bots from crawling it further, and that your code contains links for our crawler to follow.

Make sure that your website architecture includes properly set links to other parts of your website and that nothing disrupts browsing on your main page. Also check whether your code is loaded dynamically instead of all at once; if it is, our bot will have trouble finding links to follow.

Note that if you are checking a landing page website, it might genuinely have only 1 page. If you are certain that your website has more than 1 page, make sure you have enough URLs assigned to this project in the Site Audit settings.

I found links in Site Audit that are disallowed by my robots.txt.

We provide an option to respect or ignore your robots.txt rules. You can change it by navigating to Settings and opening Site Audit Settings. On this page, you can switch the toggle to respect robots.txt, as well as configure a robots.txt specifically for Sitechecker.

Make sure that the rules in your robots.txt are configured correctly. Please note that rules made in the Sitechecker robots.txt do not apply to or edit your actual robots.txt file.

I found links in Site Audit that are automatically generated or non-existent.

We suggest excluding these pages in the project Settings. To do so, navigate to the top right corner of the Site Audit section and click “Settings”, then go to Site Audit settings and click “Edit robots.txt file”. Please note that rules made in the Sitechecker robots.txt do not apply to or edit your actual robots.txt file.

Does Sitechecker fix the issues it finds on the website?

We provide Site Audit and other SEO solutions as an easy source of information about the state of your technical and ranking performance. Please note that our app does not make any changes to your website, nor does it automatically fix the errors it finds.

Do I need to create a project each time I need to update the results?

No. To update your audit results, click the Sitechecker logo in the top left corner to open the list of existing projects, or select the project in question from the drop-down menu right under the logo.

Once the project is selected, open the Site Audit section from the left sidebar and click the “Re-crawl” button in the top right corner. After the Audit finishes, you will see updated results for your project.

I fixed the issue, but the information stays the same.

Navigate to the Site Audit section from the left sidebar and click the “Re-crawl” button in the top right corner. After the Audit finishes, you will see updated results for your project. If the issue still appears in the Audit, double-check your fix. We suggest using the “How to fix” guides available next to each issue to help you get on the right track.

Can I remove the issue/warning from the Audit, if I feel it’s not applicable?

To remove a particular issue from the Site Audit, first click on it. Once you are on the issue screen, locate the “Ignore” button and click it.

You can reverse this action at any time in the Site Audit Settings.

Can I remove a certain page from the Audit?

You can set a specific rule in our virtual robots.txt. To do so, navigate to the top right corner of the “Site Audit” section and click on “Settings”, then go to “Site Audit settings” and click on “Edit robots.txt file”.

Please note that rules made in the Sitechecker robots.txt do not apply to or edit your actual robots.txt file. To learn how to form proper robots.txt rules, please refer to this guide from Google.

My project returns a “0” or “-” in Site Audit. What can be the cause?

Make sure that you have enough URLs assigned to the project in question. To do so, navigate to the top right corner of the Site Audit section and click on “Settings”, then go to Site Audit settings. 

How does website crawling work?

Our app scans your initial page for links that belong to your website. When all the links from one URL have been gathered, we follow those links to find new ones. The cycle continues until no new links are found. Your Sitemap and your Google Search Console (if connected) may also help us find URLs on your website.
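The cycle described above is essentially a breadth-first crawl. Here is a minimal sketch in Python; the in-memory page graph and the fetch_links helper are made-up stand-ins for a real crawler that would download and parse each page:

```python
from collections import deque

# A made-up in-memory "website": each URL maps to the links found on that page.
# In a real crawler, fetch_links would download the page and parse its HTML.
SITE = {
    "/": ["/blog", "/about"],
    "/blog": ["/", "/blog/post-1"],
    "/blog/post-1": ["/blog"],
    "/about": ["/"],
}

def fetch_links(url):
    return SITE.get(url, [])

def crawl(start):
    """Breadth-first crawl: follow links from each page until no new URLs appear."""
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for link in fetch_links(url):
            if link not in seen:      # only enqueue URLs we have not visited yet
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl("/")))  # every reachable page, each visited exactly once
```

The `seen` set is what guarantees the cycle terminates: once no link leads to a URL the crawler hasn't already recorded, the queue empties and the crawl ends.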

The page report indicates that a page has low speed. What can I do?

The most common practices are to optimize the images on your website by resizing or compressing them, and to make sure that your hosting provides you with good speeds.

My Title/H1 duplicates are a part of the pagination/blog pages. What can I do in this case?

It’s better to exclude them in our internal robots.txt. To do so, navigate to the top right corner of the Site Audit section and click “Settings”, then go to Site Audit settings and click “Edit robots.txt file”. Please note that rules made in the Sitechecker robots.txt do not apply to or edit your actual robots.txt file.

If you have architecture like:

site.com/blog/post/page/1

Then use:

User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /page/

If you have architecture like:

site.com/blog/post?p=1

Then use:

User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /*?p=

If you have architecture like:

site.com/blog/post?page=1

Then use:

User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /*?page=

I have a Title/H1/Description duplicates issue in Site Audit. How do I see which pages have duplicates?

Duplicates mean that 2 or more pages have the same content in a specific tag. To see which pages are duplicates, click on the issue in question.

When the list of URLs appears, click “See duplicates” on the right side of a URL. A list of pages with the same Title/H1/Description will appear.


Still need help? Contact Us