robots.txt is a plain text file that websites use to communicate with web crawlers and robots. It tells a crawler which areas of the website should not be processed or scanned.
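As a quick illustration, here is a minimal robots.txt using the two core directives of the Robots Exclusion Protocol — `User-agent` (which crawlers the rule applies to) and `Disallow` (which paths they should skip). The `/private/` path is just a placeholder:

```
User-agent: *
Disallow: /private/
```

This tells all crawlers (`*`) not to crawl anything under `/private/`, while leaving the rest of the site open.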
Weebly websites automatically include a robots.txt file as well. You can use it to control search engine indexing for specific pages or for your entire website. To view your robots.txt file, go to www.yourdomain.com/robots.txt or yoursitehere.weebly.com/robots.txt.
Working with your Weebly robots.txt file
Working with your Weebly robots.txt file gives you more control over your website. For example, you may decide to prevent search engines from indexing your entire website, or just specific pages. The default setting allows search engines to index your entire site. Below are some useful tutorials for working with these settings:
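To make the two options above concrete, here are sketches of what each rule could look like in a robots.txt file (the `/hidden-page/` path is a hypothetical example):

```
# Block search engines from the entire site
User-agent: *
Disallow: /
```

```
# Block only one specific page, allow everything else
User-agent: *
Disallow: /hidden-page/
```

Note that `Disallow: /` blocks everything, while an empty `Disallow:` value (or no rule at all) permits full crawling, which matches Weebly's default of letting search engines index the whole site.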