Weebly and the Robots.txt File


The robots.txt is a plain-text file that websites use to communicate with web crawlers and other robots. It tells a crawler which areas of the website should not be processed or scanned.
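For example, a robots.txt file is built from User-agent lines, which name a crawler (or * for all crawlers), and Disallow lines, which list paths that crawler should skip. Here is a minimal sketch with hypothetical paths:

User-agent: *
Disallow: /private/
Disallow: /drafts/

This tells every crawler to skip anything under /private/ and /drafts/ while leaving the rest of the site open.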

Weebly websites automatically include a robots.txt file, and you can use it to control search engine indexing for specific pages or for your entire website. You can view your robots.txt file by going to www.yourdomain.com/robots.txt or yoursitehere.weebly.com/robots.txt.
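As a rough illustration, a default Weebly robots.txt has historically looked something like the following, pointing crawlers at the sitemap and blocking a few of Weebly's internal paths; your file may differ, so check the URL above for the real contents:

Sitemap: http://www.yourdomain.com/sitemap.xml

User-agent: *
Disallow: /ajax/
Disallow: /apps/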



Working with Your Weebly Robots.txt File

Working with your Weebly robots.txt file gives you more control over your website. You may decide, for example, to prevent your entire website, or just some specific pages, from being indexed by search engines; the default setting allows search engines to index your entire site. Below are some useful tutorials that walk through these settings, followed by a short robots.txt illustration:

How to Hide a Weebly Page from Google

How to Stop Search Engines from Crawling a Weebly Site
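To show the difference between the two settings in robots.txt terms (hypothetical entries; the tutorials above show how Weebly's own settings generate them for you), blocking the entire site looks like this:

User-agent: *
Disallow: /

Blocking a single page instead (hypothetical path):

User-agent: *
Disallow: /hidden-page.html

Note that a Disallow rule stops crawling but doesn't always remove a page that is already indexed; Weebly's page settings can also mark a page as hidden from search engines, which covers that case.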

If you liked this article, consider subscribing to our WeeblyTutorials YouTube Channel for Weebly video tutorials. You can also find us on Twitter and Facebook.



Editorial Staff at WeeblyTutorials is a team of Weebly experts.
