Site Audit FAQ
1. How can I create a project?
There are two main ways to create a project:
1. Click on the Sitechecker logo in the top left corner, then click on “Create a project” in the top right corner.
2. Hover your cursor over the project you are currently in, in the top left corner. A menu with all the projects on your account will appear, along with a button to create a project.
Click on “Create a project” to add a new website to your account.
You can access project settings from almost any section of the Sitechecker platform.
Just look for the “Settings” button in the top right corner of the screen in Dashboard, Site Audit, Site Monitoring, Rank Tracker, or Backlink Tracker.
You can change the number of crawled pages, which links to crawl via custom robots.txt rules, how frequently you want your project to be re-crawled, and much more.
2. How do I find where a given link is located on my website?
For each link present in the audit, we provide lists of anchors (internal backlinks), external links (links that lead to another website), and internal links (links that lead to other URLs within your website). They are listed right under the URL.
By clicking on “Anchors” you will see a list of the places in your website’s code where this link can be found.
3. I can’t find certain pages in Audit. What can I do?
Make sure that the link in question can be accessed from one of the pages on your website and does not require any manual input.
You can also use the search option in the “Issues” section to find the page you are interested in.
4. My website requires a login or acceptance of cookie policies to access the content. Will Sitechecker work with my website?
Sitechecker can’t crawl websites that require manual input. Try disabling the login form or cookie requirement for the duration of the Site Audit and re-enabling it once the Audit is finished.
Please note that you will need to repeat this procedure for all future Site Audit checks.
5. How is the Website Score calculated?
We use the following formulas to calculate this metric:
Website Score = (sum of all OnePageScore values) / number of pages
OnePageScore = 100 − cost of critical error 1 − cost of critical error 2 − cost of warning 1 − and so on
Cost of a specific critical error = (60 × number of occurrences of that error) / total number of critical errors
Cost of a specific warning = (40 × number of occurrences of that warning) / total number of warnings
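As an illustration, here is a minimal Python sketch of that calculation. It is not Sitechecker’s actual implementation, and the issue counts are made up; it only shows how the critical-error weight of 60 and the warning weight of 40 are distributed:

def issue_cost(weight, specific_count, total_count):
    # weight is 60 for critical errors and 40 for warnings
    return weight * specific_count / total_count

def one_page_score(error_counts, warning_counts, total_errors, total_warnings):
    # 100 minus the cost of every critical error and warning found on the page
    score = 100
    for count in error_counts:
        score -= issue_cost(60, count, total_errors)
    for count in warning_counts:
        score -= issue_cost(40, count, total_warnings)
    return score

# Hypothetical site with 2 pages, 4 critical errors and 5 warnings in total:
pages = [([2], [3]), ([2], [2])]  # (error counts, warning counts) per page
scores = [one_page_score(e, w, 4, 5) for e, w in pages]
website_score = sum(scores) / len(pages)  # (46 + 54) / 2 = 50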
You can learn more about how the Website Score is calculated by following this link.
6. My audit has been loading for more than 10 minutes.
If you have an e-commerce website or a project with many pages, the audit might take longer given the number and complexity of the pages. Pay attention to the progress bar and the number of crawled pages. If this number does not change for more than 30 minutes, please let us know in our support chat; we will be happy to help.
7. My Site Audit has only 1 page checked.
Make sure that your website does not have a firewall that prevents bots from crawling it further, and that there are links in your code for our crawler to follow.
Make sure that your website architecture has properly set links to other parts of your website and that nothing disrupts browsing on your main page. Another tip is to check whether your code is loaded dynamically instead of all at once: if it is, our bot will have trouble finding links to follow.
Please note that if you are checking a landing website, it might genuinely have only 1 page. If you are certain that your website has more than 1 page, make sure you have enough URLs assigned to this project in Site Audit settings.
8. I found links in Site Audit that are disallowed by my robots.txt.
We provide an option to respect or ignore your robots.txt rules.
This can be changed by navigating to Settings and going to Site Audit Settings. On this page, you are able to switch the toggle to respect robots.txt as well as configure the robots.txt specifically for Sitechecker.
Make sure that the rules in your robots.txt are configured correctly. Please note that rules made in the Sitechecker robots.txt will not apply to or edit your actual robots.txt file.
9. I found links in Site Audit that are automatically generated or non-existent.
We suggest excluding these pages in the project’s Settings. To do so, navigate to the top right corner of the Site Audit section and click on “Settings”, then go to Site Audit settings and click on “Edit robots.txt file”. Please note that rules made in the Sitechecker robots.txt will not apply to or edit your actual robots.txt file.
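For example, if the generated URLs share a common pattern (the path below is only a placeholder; substitute the pattern your generated URLs actually share), a rule like this would exclude them:
User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /generated/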
10. Does Sitechecker fix the issues it finds on the website?
We provide Auditing and other SEO solutions as an easy-to-use information source about the state of your technical and ranking performance. Please note that our app does not make any changes to your website, nor does it automatically apply fixes for the errors it finds.
11. Do I need to create a project each time I need to update the results?
No. To get updated audit results, click on the Sitechecker logo in the top left corner to open the list of existing projects, or select the project in question from the drop-down menu right under the logo.
Once the project is selected, navigate to the Site Audit section in the left sidebar and click on the “Re-crawl” button in the top right corner. After the Audit is finished, you will see updated results for your project.
12. I fixed the issue, but the information stays the same.
Navigate to the Site Audit section in the left sidebar and click on the “Re-crawl” button in the top right corner. After the Audit is finished, you will see updated results for your project. If the issue remains in the Audit, please double-check your fix. We suggest using the “How to fix” guides available next to each issue to help you get on the right track.
13. Can I remove an issue/warning from the Audit if I feel it’s not applicable?
To remove an issue/warning from Site Audit, hover over the issue/warning in question with your cursor, and an “Ignore issue” button will appear right next to the “How to fix” article.
Please note that you can reverse this action in the project’s Settings at any time.
14. Can I remove a certain page from the Audit?
You can set a specific rule in our virtual robots.txt. To do so, navigate to the top right corner of the “Site Audit” section and click on “Settings”, then go to “Site Audit settings” and click on “Edit robots.txt file”.
Please note that rules made in the Sitechecker robots.txt will not apply to or edit your actual robots.txt file. To learn how to write proper robots.txt rules, please refer to this guide from Google.
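For example, to exclude a single page (the path below is only a placeholder; use the actual path of the page you want removed), add:
User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /old-page/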
15. My project returns a “0” or “-” in Site Audit. What can be the cause?
Make sure that you have enough URLs assigned to the project in question. To do so, navigate to the top right corner of the Site Audit section and click on “Settings”, then go to Site Audit settings.
There you will see how many pages are dedicated to this specific project. If you have used up all the available URLs on other projects, you can reduce the number assigned to them in their Site Audit settings.
16. How does website crawling work?
Our app scans your initial page for links that are a part of your website. When all the links from one URL are gathered, we move through those links to find new ones. The cycle continues until no new links are found.
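In other words, this is a classic breadth-first crawl. The Python sketch below is a simplified illustration of the idea, not our actual crawler; fetch_links is a placeholder for the part that downloads a page and extracts its internal links.

from collections import deque

def crawl(start_url, fetch_links):
    # fetch_links(url) is assumed to download a page and return its internal links.
    seen = {start_url}          # pages we have already discovered
    queue = deque([start_url])  # pages whose links we still need to gather
    while queue:                # the cycle continues until no new links are found
        url = queue.popleft()
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen                 # every page discovered from the initial one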
17. The page report indicates that a page has low speed. What can I do?
The most common practices are to optimize the images on your website by resizing or compressing them, and to make sure that your hosting provides good speeds.
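As an illustration of the image side, here is a minimal Python sketch using the Pillow library; the file names and the quality value of 80 are only example assumptions, not recommendations from Sitechecker:

from PIL import Image

img = Image.open("hero.png").convert("RGB")  # placeholder file name; RGB is required for JPEG
img.thumbnail((1200, 1200))                  # resize so the longest side is at most 1200px
img.save("hero.jpg", "JPEG", quality=80, optimize=True)  # compress on save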
18. My Title/H1 duplicates are a part of the pagination/blog pages. What can I do in this case?
It’s better to exclude them in our virtual robots.txt. To do so, navigate to the top right corner of the Site Audit section and click on “Settings”, then go to Site Audit settings and click on “Edit robots.txt file”. Please note that rules made in the Sitechecker robots.txt will not apply to or edit your actual robots.txt file.
If your URL architecture looks like:
site.com/blog/post/page/1
Then use:
User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /page/
If your URL architecture looks like:
site.com/blog/post?p=1
Then use:
User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /*?p=
If your URL architecture looks like:
site.com/blog/post?page=1
Then use:
User-agent: SiteCheckerBotCrawler/1.0 (+http://sitechecker.pro)
Disallow: /*?page=
19. I have a Title/H1/Description duplicates issue in Site Audit. How do I see which pages have duplicates?
Duplicates mean that 2 or more pages have the same content in a specific tag. To see which pages are duplicates, click on the issue in question.
When the list of URLs appears, click on “See duplicates” on the right side of a URL. A list of pages with the same Title/H1/Description will appear.