The purpose of the robots.txt file is to tell search bots which parts of your website they should and should not crawl and index. Most often it is used to list the files and folders that should not be indexed by search engines.
To allow search bots to crawl and index the entire content of your website, add the following lines to your robots.txt file:
User-agent: *
Disallow:
On the other hand, if you wish to prevent your entire website from being indexed, use the lines below:
User-agent: *
Disallow: /
For more advanced configurations you will need to understand the directives in the robots.txt file. The "User-agent:" line specifies which bots the rules apply to. Use "*" as the value to apply the rule to all search bots, or use the name of a specific bot to create rules just for it, as shown in the example below.
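For instance, the rules below (a hedged illustration; "Googlebot" is Google's crawler, and the "/images" folder is only a placeholder) block one specific bot from a folder while leaving all other bots unrestricted:

User-agent: Googlebot
Disallow: /images

User-agent: *
Disallow: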
The "Disallow:" part defines the files and folders that should not be indexed by search engines. Each folder or file must be defined on a new line. For example, the lines below will tell all the search bots not to index the "private" and "security" folders in your public_html folder:
User-agent: *
Disallow: /private
Disallow: /security
Please note that the "Disallow:" directive uses your website's root folder as its base directory, so the path to a file should be written as /sample.txt rather than /home/user/public_html/sample.txt.
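If you want to check how your rules will be interpreted, Python's standard urllib.robotparser module can parse a robots.txt file and answer allow/deny questions. Below is a minimal sketch, assuming the rules above are published on www.example.com (a placeholder domain - replace it with your own):

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt (example.com is a placeholder domain)
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # download and parse the file

# Ask whether any bot ("*") may fetch a URL under a disallowed folder
print(rp.can_fetch("*", "https://www.example.com/private/page.html"))  # expected: False
print(rp.can_fetch("*", "https://www.example.com/index.html"))         # expected: True

The results depend on the robots.txt actually served by your site, so treat the expected values as illustrative rather than guaranteed.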
Don't forget: two of the essentials for SEO are your website's loading speed and its geographical location, which is why choosing the right host matters. Check out joombig template and extension services.
Source: https://www.siteground.com/kb/how_to_use_the_robotstxt_file/