WordPress sites usually expose the same content at multiple URLs — feeds, trackbacks, comment pages, and paginated archives all lead back to the same post. Google has become more sensitive to this and flags the affected pages with a “Duplicate Title Tag” warning. Once flagged, those pages may be filtered out of the search results, and your traffic drops as the visitors referred by the search engines disappear.
An easy mitigation is to use the robots.txt file to tell Googlebot not to crawl the duplicate URLs. For example, a post such as:

eBusiness Adviser | Network Monitoring Service
/business-process/network-health-monitoring-service/

may also be reachable through its feed, trackback, and paginated-archive URLs, all carrying the same title. The rules below block those variants:
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /*?*
Disallow: /*?
Disallow: /*page/*

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
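You can sanity-check which URLs these rules actually block before deploying them. Note that Python's built-in `urllib.robotparser` does not understand Google's `*` wildcard extension, so the sketch below translates each Disallow pattern into a regular expression by hand. The sample paths are illustrative assumptions, not real URLs from your site:

```python
import re

# Googlebot-specific Disallow patterns from the robots.txt above.
DISALLOW_RULES = [
    "/*/trackback",
    "/*/feed",
    "/*/comments",
    "/*?*",
    "/*?",
    "/*page/*",
]

def rule_to_regex(pattern: str) -> re.Pattern:
    # Google-style matching: "*" matches any run of characters,
    # and a rule matches any URL path that starts with the pattern.
    return re.compile("^" + re.escape(pattern).replace(r"\*", ".*"))

def is_blocked(path: str) -> bool:
    return any(rule_to_regex(rule).match(path) for rule in DISALLOW_RULES)

# The feed variant of a post is blocked, as is a query-string permalink,
# while the canonical pretty permalink stays crawlable.
print(is_blocked("/business-process/network-health-monitoring-service/feed"))  # True
print(is_blocked("/?p=123"))                                                   # True
print(is_blocked("/business-process/network-health-monitoring-service/"))      # False
```

Running a check like this against a list of your site's URL variants is a quick way to confirm the wildcards block only the duplicates and never the canonical pages.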
Here are some great ideas on how to make a WordPress blog duplicate-content safe.