
Did you know that robots.txt lets you control which posts or pages of your site search engines crawl? And if you’re worried about duplicate content on your site, robots.txt helps there too: by disallowing crawling of directories like tag, archive, and category — the usual culprits behind duplicate content — you can keep those listings out of the index.

A well-crafted robots.txt can help you rank better in most search engines and earn more from relevant ads.


Copy and paste the snippet below into your site’s robots.txt.
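The original snippet wasn’t preserved here, but based on the directories mentioned above (tag, archive, and category), a minimal robots.txt along these lines would do the job. The sitemap URL and exact paths are assumptions — adjust them to your own permalink structure:

```
User-agent: *
# Block archive-style listings that duplicate post content
Disallow: /tag/
Disallow: /category/
Disallow: /archives/
# Everything else stays crawlable
Allow: /

# Assumed location — point this at your actual sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Note that robots.txt only discourages crawling; pages can still appear in results if linked from elsewhere, which is where the header.php trick below comes in.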

header.php trick

Add this snippet to your current theme’s header.php. It’s simply a conditional statement that tells robots which pages of our blog to index.
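The exact snippet also didn’t survive here, but a sketch of the idea — using WordPress’s standard conditional tags to output a robots meta tag inside `<head>` — might look like this. The noindex targets (category, tag, and other archives) are my assumption, matching the duplicate-content directories discussed above:

```php
<?php
// Hedged sketch: mark archive-style pages (the usual duplicate-content
// sources) as noindex, while letting robots follow their links.
if ( is_category() || is_tag() || is_archive() ) {
    echo '<meta name="robots" content="noindex,follow" />' . "\n";
} else {
    // Single posts, pages, and the front page stay fully indexable.
    echo '<meta name="robots" content="index,follow" />' . "\n";
}
?>
```

Unlike robots.txt, this meta tag removes already-crawled archive pages from the index while still letting link equity flow through them.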

Don’t want to get your hands dirty?

Don’t worry, folks — as always, I’ll point you to a plugin from another developer, if one exists 🙂
KB Robots.txt
