Lots of Duplicate Meta Tag Errors in Webmaster Tools – Solved with a robots.txt File!
If you use Webmaster Tools, you have probably seen the HTML Improvements page listing lots of errors: Duplicate meta descriptions, Long meta descriptions, Duplicate title tags, and so on…
Removing these errors has always been a tedious task for webmasters, but they really do need to be removed: if the count climbs above 500, it can lower your site's reputation with Google.
If you are a WordPress user, most of these errors can be controlled with a robots.txt file in your public_html directory. Just create the file as described below and you will get rid of most of them. robots.txt is a small file that resides in your public_html directory and is read by search engine crawlers before they crawl your site, so they know immediately what to crawl and what to avoid. In other words, this file acts as a gatekeeper for your site. Follow the steps below to block the sensitive areas of your WordPress install, keep crawlers out of places they don't belong, and greatly reduce those duplicate meta tag errors:
- Create a file with a plain text editor on your computer and name it robots.txt
- cgi-bin Folder: This folder stores CGI scripts run by your server and is present on almost every server under the public_html directory. You should keep search engines out of it. Type the line Disallow: /cgi-bin/ to block this folder from getting indexed.
- wp-admin Folder: This is WordPress's main administration folder; the majority of the administration files live here, so it is very safe to disallow it for crawlers as well. Type the line Disallow: /wp-admin/ to block this folder from getting indexed.
- wp-content Folder: This folder holds all your plugins, uploads, upgrades, and other files. You should block this folder too. Type the line Disallow: /wp-content/ to block this folder from getting indexed.
- wp-includes Folder: This folder contains WordPress's core support files and should not be indexed by crawlers. Type the line Disallow: /wp-includes/ to block this folder from getting indexed.
- feed: Normally this area of your site is not important for search results. If you do not want crawlers to index your feeds, block them with the line Disallow: /feed/
- trackback: Trackback URLs also produce a lot of error messages in Webmaster Tools; block them too, using Disallow: /trackback/
- tag: Most of the errors in Webmaster Tools come from the tag pages of a WordPress site. I strongly recommend blocking them with the line Disallow: /tag/
- author: If your site has only one author, you should block the author section as well, using Disallow: /author/
- categories: If you are not using categories, you can block them from being indexed with the line Disallow: /category/
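Putting the steps above together, the finished file looks like this. The Disallow lines are exactly the ones from the steps; the User-agent: * header is standard robots.txt syntax that makes the rules apply to all crawlers (drop any Disallow line for a section you want indexed):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
Disallow: /tag/
Disallow: /author/
Disallow: /category/
```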
The lines above will remove the majority of the errors in Webmaster Tools, but remember that some things must not be blocked, so only include the Disallow lines that apply to your site. Every robots.txt file also needs a User-agent line at the very top telling crawlers which of them the rules apply to; User-agent: * applies them to all crawlers, so type that line at the top of your robots.txt file.
If you are using a sitemap, then add this as the very last line of the file:
Sitemap: http://(domain name)/sitemap.xml
After creating the file, save it and upload it to your public_html directory using an FTP client.
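Before (or after) uploading, you can sanity-check the rules with Python's standard-library robots.txt parser. This is a quick local sketch: the file contents are the directives assembled from the steps above, and the two test paths are just illustrative examples.

```python
from urllib.robotparser import RobotFileParser

# The rules from the steps above; User-agent: * applies them to all crawlers.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
Disallow: /tag/
Disallow: /author/
Disallow: /category/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers should be kept out of the blocked areas...
print(parser.can_fetch("*", "/wp-admin/"))       # False
print(parser.can_fetch("*", "/tag/wordpress/"))  # False
# ...while ordinary post URLs stay crawlable.
print(parser.can_fetch("*", "/my-first-post/"))  # True
```

If a path you expect Google to index comes back False here, remove or adjust the corresponding Disallow line before uploading the file.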
The above solution has worked for many webmasters who use WordPress as their CMS, and by following the same pattern you can write a robots.txt for other CMSs as well.