A robots.txt file is simply a plain-text file that websites use to communicate with web crawlers and other robots. Its rules tell a robot which areas of the website should not be processed or scanned.
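As an illustration, a minimal robots.txt might look like the sketch below. The User-agent and Disallow directives are part of the standard Robots Exclusion Protocol; the path shown is just a hypothetical example, not something taken from an actual Weebly site:

```
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers not to scan anything under /private/ (hypothetical path)
Disallow: /private/
```

A crawler that honors the protocol will skip any URL starting with /private/ while remaining free to crawl the rest of the site.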

Weebly websites automatically include a robots.txt file as well. You can use it to control search engine indexing for specific pages or for your entire website. You can view your robots.txt file by appending /robots.txt to your site's domain, which is the standard location for the file.

Working with Your Weebly Robots.txt File

Working with your Weebly robots.txt file gives you more control over your website. For example, you may decide to prevent your entire website, or just specific pages, from being indexed by search engines. The default setting allows search engines to index your entire site. Below are some useful tutorials for playing with these settings:

How to Hide a Weebly Page from Google

How to Stop Search Engines from Crawling a Weebly Site
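The two scenarios covered by those tutorials boil down to how broad the Disallow rule is. A hedged sketch, with hypothetical page paths, might look like this:

```
# Hide a single page from all crawlers (hypothetical path)
User-agent: *
Disallow: /hidden-page.html

# ...or block the entire site: a Disallow of "/" covers every URL
User-agent: *
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers such as Googlebot respect it, but it is not an access control mechanism, so it should not be relied on to keep truly private content hidden.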

If you liked this article, consider subscribing to our WeeblyTutorials YouTube Channel for Weebly video tutorials. You can also find us on Twitter and Facebook.