Project Settings FAQ

1. I set 30,000 pages to crawl on my website, so why were only 4,000 URLs crawled?

The maximum number of URLs crawled for one project is 30,000. However, even within that limit our user agent may not be able to reach every page of your website. There are a few common reasons for this. The most obvious one is that your robots.txt rules block our bot from crawling specific parts of the site. The second is that the crawl filters in your project settings exclude some of your pages. The last common reason is that your website gets overloaded by our traffic and some pages return a 5xx status code; as a result, pages that are only linked from those pages will not be crawled either.
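As a rough illustration, robots.txt rules like the ones below keep a crawler out of whole sections of a site. The user-agent name here is a made-up placeholder, not necessarily the one our bot identifies itself with:

```
# Hypothetical example: keep one specific bot out of two sections
User-agent: ExampleAuditBot
Disallow: /private/
Disallow: /tag/

# Keep every crawler out of internal search results
User-agent: *
Disallow: /search
```

Any URL under a disallowed path is skipped, and pages that can only be reached through links on those skipped pages typically will not be discovered either.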

2. I have purchased a plan with a 1,500 URL limit, but as soon as I add a new domain it says the limit is reached.

If you subscribe, for example, to the Basic Monthly plan with a 1,500 URL limit, those 1,500 URLs are shared across all 3 of your websites. If you set the maximum limit of 1,500 URLs on a single project, there is no space left for the other projects. In other words, even if your website has only 500 URLs, setting a 1,500 URL limit for that project in the Site Audit settings reserves the whole quota for it. So check how many pages each of your websites actually has and set the limits accordingly, leaving a little headroom: if a website has 500 URLs, set 600 in the settings to be safe.

3. If I change the robots.txt settings in Sitechecker, will it change the actual robots.txt file on my website?

No, it will not. It is completely safe to change and customize the robots.txt settings in Sitechecker, as they only set the rules our own bot follows.

4. What does the number of parallel requests mean?

The number of parallel requests is how many requests our bot sends to your website at the same time. The more requests it sends, the faster the bot works through your website and the sooner you get your Site Audit report. It is a really helpful feature; however, if you are not sure how much load your hosting can handle, set fewer parallel requests to avoid overloading the server and causing 5xx errors.
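For context, the sketch below is not our actual crawler, just a minimal illustration (using Python with the requests library and made-up URLs) of what raising the number of parallel requests does: pages are fetched faster, but the server has to answer more requests at the same moment.

```python
from concurrent.futures import ThreadPoolExecutor

import requests

# Made-up URLs; a real audit takes them from the site's link structure.
urls = [f"https://example.com/page-{i}" for i in range(20)]

def fetch(url: str) -> tuple[str, int]:
    """Request one page and return its HTTP status code."""
    response = requests.get(url, timeout=10)
    return url, response.status_code

# max_workers plays the role of the "parallel requests" setting:
# 10 workers get through the list much faster than 1, but the server
# also has to serve up to 10 requests at the same time.
with ThreadPoolExecutor(max_workers=10) as pool:
    for url, status in pool.map(fetch, urls):
        if status >= 500:
            print(f"{url} returned {status} - the server may be overloaded")
```

If the 5xx check starts firing, that is the situation described above: the hosting cannot keep up, and the safer choice is a lower number of parallel requests.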

5. Where can I see the same settings for a One-Time Site Audit? 

As of now, these settings are not available for a One-Time Site Audit. It is only possible to customize the settings of the main projects you add to our tool.

6. How do I add several countries or devices to one project in Rank Tracker?

To track statistics for several countries in Rank Tracker for the same domain, create a few separate projects that differ only in country, city, or device. The same goes for the search engine and language. However, make sure to set the number of pages to crawl to 0 in the Site Audit settings of these duplicated projects so that they do not waste your crawling limits.

7. How do I group my keywords in Rank Tracker into custom categories?

To create custom groups, go to the “Keywords” section in the project settings and select the keywords you would like to group. Then click the last button of the sorting bar and create a custom group of keywords.


Still need help? Contact Us