Service Team
I'm wondering, is it good practice to allow bots to crawl every single aspect of your website? I mean every single directory. I know you can modify robots.txt to prevent them from crawling certain folders. Is there a reason to disallow bots from accessing certain parts of your website? Will this affect SEO? I'm not too savvy on SEO, so I'd like to hear your thoughts!
Cranky Curmudgeon
The accepted wisdom has shifted over the years, but the broad answer is 'not really'. You want search engines to focus on what's actually valuable (the real content), and there are plenty of areas that aren't good content and won't help SEO. Those can be carved out with robots.txt to steer crawlers toward the things you do want them to look at.
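As a sketch, a minimal robots.txt along these lines would carve out low-value areas for all crawlers (the directory names and domain here are purely illustrative, not a recommendation for any specific site):

```
User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

Listing a sitemap alongside the disallow rules is a common way to point crawlers at the content you do want indexed.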
But robots.txt is a recommendation, not a hard limit - if you really don't want search engines going there, don't make it visible to guests at all.
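To illustrate the "recommendation" point: well-behaved crawlers voluntarily check robots.txt before fetching a URL, and Python's standard library ships a parser for exactly this. A minimal sketch (the rules and user-agent name "ExampleBot" are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Parse an illustrative robots.txt that blocks /admin/ for all user agents.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# A compliant crawler asks before fetching; nothing stops a rude one.
print(rp.can_fetch("ExampleBot", "https://example.com/admin/settings"))   # False
print(rp.can_fetch("ExampleBot", "https://example.com/articles/seo-tips"))  # True
```

Nothing in this mechanism enforces the rules - a misbehaving bot simply skips the `can_fetch` check - which is why anything genuinely private should sit behind authentication rather than a Disallow line.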