We do not apply any default crawl delay directive on our servers.
You can decide what crawl delay, if any, should apply to your site(s) by editing its robots.txt file. A crawl delay is set by adding a line to this file with the following syntax:
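For example, to ask all bots to wait at least one second between requests (matching the one-second interval discussed below):

```
User-agent: *
Crawl-delay: 1
```

The value is a number of seconds; a specific bot can be targeted instead by naming it in the `User-agent` line (e.g. `User-agent: Bingbot`).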
This directive tells some search engine bots (such as Yandex, Yahoo! and Bing) the minimum interval, in seconds, that you allow between their requests. With `Crawl-delay: 1`, the instruction is that the same bot should wait at least one second between each page it visits. The purpose of this line is to prevent your site from being crawled too aggressively, since aggressive crawling can cause problematic spikes in your account's resource usage or consume an unnecessary amount of resources. Note that not all crawlers honor this directive; Googlebot, in particular, ignores it.
Don’t forget – two of the SEO essentials for any site are its loading speed and its geographical location. That is why it is important to choose the right host. Check out joombig template and extension services.