
Should you allow bots to crawl every URL on your site?

Ravenfreak

Service Team
Joined
Oct 16, 2022
Messages
2,319
Credits
10,370
I'm wondering: is it good practice to allow bots to crawl every single aspect of your website, meaning every single directory? I know you can modify robots.txt to prevent them from crawling certain folders. Is there a reason to disallow bots from accessing certain parts of your website? Will this affect SEO? I'm not too savvy on SEO, so I'd like to hear your thoughts!
 
The exact wisdom has changed over the years, but the broad answer is "not really". You want search engines to focus on the things that are of value, i.e. the actual content. There are plenty of areas that are neither good content nor good for SEO, and these can be carved out with robots.txt to encourage crawlers to look at the things you do want them to look at.
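As a sketch, a forum's robots.txt along those lines might look like this (the paths and the sitemap URL are hypothetical examples, not any particular forum's actual structure):

```
# Apply to all crawlers
User-agent: *
# Low-value or duplicate-content areas carved out of crawling
Disallow: /search/
Disallow: /login/
Disallow: /members/
# Everything else remains crawlable by default

# Point crawlers at the content you do want indexed
Sitemap: https://example.com/sitemap.xml
```

The idea is to spend the crawler's attention on threads and articles rather than on search result pages, login forms, and other pages with no SEO value.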

But robots.txt is a recommendation, not a hard limit. If you really don't want search engines going somewhere, don't make it visible to guests at all.
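To illustrate why it's only a recommendation: well-behaved crawlers fetch robots.txt and check it voluntarily before requesting a URL, and nothing in the protocol enforces compliance. A minimal sketch of that check using Python's standard-library `urllib.robotparser` (the rules here are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt; a polite crawler would fetch
# this from the site, but nothing forces a crawler to honor it.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# A compliant crawler consults the rules before each request.
print(rp.can_fetch("*", "/admin/settings"))      # disallowed path
print(rp.can_fetch("*", "/threads/some-topic"))  # allowed path
```

A misbehaving bot simply skips this check, which is why genuinely private areas need access control (e.g. guest permissions), not just a robots.txt entry.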
 
