How to Activate the Robots.txt File?

Search engines like Google and Yahoo deploy automated bots that routinely crawl websites, collecting content such as blog posts and images and deciding what to include in their indexes. A robots.txt file gives these search engines the information they need to crawl and index a website's content properly.

The robots.txt file lets website owners control the crawling and indexing process, offering valuable advantages such as SEO optimization and traffic control. It allows you to prioritize which pages you want search engines to focus on.

This can enhance your website's SEO (Search Engine Optimization) efforts by ensuring that the most valuable and relevant content gets indexed and ranked prominently in search results. Furthermore, by specifying which parts of your site should or should not be crawled, you can manage the flow of search engine bot traffic. This helps prevent the bots from overwhelming your server with excessive requests, ensuring a smoother user experience for your visitors.
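For illustration, a typical robots.txt file might look like the example below. The bot name, paths, and sitemap URL are placeholders; you would replace them with the sections of your own site you want to expose or hide:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Allow: /blog/

    Sitemap: https://www.example.com/sitemap.xml

Here, all bots are told not to crawl the /admin/ and /tmp/ paths, while /blog/ remains crawlable, and the sitemap location helps crawlers find the pages you do want indexed.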

To access the robots.txt setting in the Medianova Cloud Panel, follow these steps:

  1. Start by clicking on 'CDN' in the left-hand menu, then select 'CDN Resources' from the submenu.

  2. From the list of CDN Resources, choose the one for which you wish to configure the robots.txt setting.

  3. Navigate to the 'Caching' tab within the selected CDN Resource.

  4. Within the 'Caching' tab, you'll find the robots.txt setting. Selecting the 'Enable' option activates the Medianova Robots.txt file, which allows indexing of all files.

  5. Alternatively, selecting the 'Origin' option serves the robots.txt file located on your origin server instead. An example of an "allow all" robots.txt is shown after these steps.
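As a point of reference, a robots.txt file that allows indexing of all files (the behavior described for the 'Enable' option above) typically contains only the following; this is a generic illustration, not necessarily the exact contents Medianova serves:

    User-agent: *
    Disallow:

An empty Disallow directive means no path is blocked, so crawlers are free to index every file on the site.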
