I'm wondering, is it good practice to allow bots to crawl every single part of your website? I mean every single directory. I know you can modify robots.txt to prevent them from crawling certain folders, something like the example below. Is there a reason to disallow bots from accessing certain parts of your website? Will this affect SEO? I'm not too savvy on SEO, so I'd like to hear your thoughts!
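
For reference, here's a minimal robots.txt sketch of what I mean. The directory names (/admin/ and /cgi-bin/) are just hypothetical examples of folders you might not want crawled:

```
# Hypothetical example: block all crawlers from two directories,
# everything else stays crawlable
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```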