Duplicate content can throw a wrench into any website’s search engine optimization (SEO) strategy, dragging down its rankings on search engine results pages (SERPs) and costing it traffic. Find out what duplicate content is and how you can tackle this problem.
What Is Duplicate Content?
Image via Flickr by Jeremy Brooks
Duplicate content refers to blocks of text that appear on multiple pages of your website or across different domains. These blocks don’t have to be completely identical to count as duplicates; content that is largely the same qualifies, too.
Why Is Duplicate Content a Problem?
Repeated content presents a serious problem for search engines, which strive to drop duplicate content from their results. That means they have to make a split-second decision about which version of the content to feature.
Search engines typically consider the strength of the competing sites, review the timestamps on the content to confirm which website posted it first, and weigh the length of the duplicated text against the amount of content on the entire site. Ultimately, one version keeps its spot on the SERP, while the pages carrying the duplicate content rank poorly or not at all.
What Types of Duplicate Content Are Harmful?
Not all duplicate content is harmful, but some of it is, so it’s a problem worth addressing. Google considers duplicate content harmful when it’s repeated across domains for the express purpose of manipulating search engines and inflating rankings. Because this compromises search engines’ ability to offer a good user experience, Google typically penalizes the behavior.
Is Any Kind of Duplicate Content Okay?
Many instances of duplicate content aren’t malicious, and you can prevent them from harming your SEO if you handle them appropriately. For instance, mobile websites, translated copy, printer-friendly pages, paginated comment pages, and multiple session IDs cause benign instances of duplicate copy, as do syndicated content, standard product copy, and URL parameters. Just because this duplicate content isn’t deliberately harmful doesn’t mean you’re off the hook, though.
Best Practices for Tackling Duplicate Content
Search engines might be smart, but they don’t know if you created identical content by accident or on purpose. To avoid SEO penalties, seek out common duplicate content issues and address them as soon as possible.
Use Duplication Checkers
If you’re not sure whether your site has harmful repeated content, use a duplication checker. Google Search Console (formerly Google Webmaster Tools) typically alerts you to duplicate content and helps you find repeated page titles and descriptions.
You can also complete a more intensive duplication check on your site with a tool like Screaming Frog SEO Spider. This audit tool crawls your website and identifies repeated URLs, page titles, and meta descriptions.
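At its core, the kind of duplication check these tools run boils down to grouping pages that share the same title and meta description. Here’s a minimal sketch of that idea; the `pages` dictionary of URLs, titles, and descriptions is hypothetical stand-in data that a real crawler would collect for you.

```python
from collections import defaultdict

# Hypothetical crawl results: URL -> (page title, meta description).
# A real audit tool like Screaming Frog gathers these by crawling the site.
pages = {
    "/red-widget": ("Red Widget | Acme", "Buy the best red widget."),
    "/blue-widget": ("Blue Widget | Acme", "Buy the best blue widget."),
    "/red-widget?ref=footer": ("Red Widget | Acme", "Buy the best red widget."),
}

def find_duplicates(pages):
    """Group URLs that share the same title and meta description."""
    groups = defaultdict(list)
    for url, (title, description) in pages.items():
        groups[(title, description)].append(url)
    # Keep only groups with more than one URL -- those are the duplicates.
    return [urls for urls in groups.values() if len(urls) > 1]

print(find_duplicates(pages))
```

Here the parameterized URL `/red-widget?ref=footer` gets flagged alongside `/red-widget`, which is exactly the kind of accidental duplication a crawl report surfaces.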
Set Your Preferred Domain
It might seem minor, but to search engines, domain mix-ups are a big deal. Your website’s URL can resolve with or without a www, but letting search engines index both versions creates duplicate content. Decide whether your site should appear with or without a www, and set your preferred domain.
You can easily set your preferred domain in Google Search Console, but keep in mind that this only affects Google searches. To set your preferred domain for all search engines, change it in your website’s dashboard or ask your webmaster to adjust the settings.
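Under the hood, enforcing a preferred domain usually means redirecting every request on the non-preferred host to its counterpart on the preferred one. This sketch shows the logic; `www.example.com` is a placeholder for whichever version you choose.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred domain -- swap in your own choice (www or non-www).
PREFERRED_HOST = "www.example.com"

def redirect_target(url):
    """Return the 301 redirect target for a URL on the non-preferred host,
    or None if the URL already uses the preferred domain."""
    parts = urlsplit(url)
    if parts.netloc == PREFERRED_HOST:
        return None
    return urlunsplit(
        (parts.scheme, PREFERRED_HOST, parts.path, parts.query, parts.fragment)
    )

print(redirect_target("https://example.com/about"))
```

In practice your web server or host handles this with a rewrite rule rather than application code, but the mapping is the same: one host survives, the other always redirects to it.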
Add a Canonical Tag
If your site uses a standard content management system (CMS), you organize material using tags and categories. Sometimes that results in unintentionally duplicating content within search results, even though the content doesn’t really exist in two different places on your site.
This problem also arises when your site uses URL parameters for analytics or link tracking. These don’t alter the page content, but they do create the illusion of duplicate content.
To get around this issue, establish a canonical URL by adding a rel=canonical tag that gives credit to the primary page. You can also tell Google how to handle URL parameters in Google Search Console.
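The canonical URL is simply the page’s address with the content-neutral parameters stripped away, declared in a `<link rel="canonical">` element in the page’s `<head>`. A minimal sketch, assuming a hypothetical list of tracking parameters:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters that don't change the page content.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "sessionid"}

def canonical_url(url):
    """Strip tracking parameters so parameterized URLs collapse to one form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def canonical_tag(url):
    """Build the <link rel="canonical"> element for the page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'

print(canonical_tag("https://example.com/shoes?utm_source=news&color=red"))
```

Note that a parameter like `color=red` survives because it genuinely changes what the page shows, while `utm_source` is dropped.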
Put Redirects in Place
Another way to reduce unintentional duplications is to set up 301 redirects. Not only does this method essentially combine multiple pages into one, but it also helps clean up any site structure issues. A 301 redirect can also help boost your SEO since it prevents duplicate pages from competing with each other.
Depending on your site setup, you might have a few options for creating 301 redirects. Set them up through your host or use a plugin, and test for broken links to make sure that the redirects work properly.
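Conceptually, a 301 setup is just a map from retired duplicate URLs to the single page that replaces them. This sketch uses a hypothetical redirect map to show the behavior a plugin or host-level rule implements for you:

```python
# Hypothetical redirect map: old duplicate URLs -> the page they merge into.
REDIRECTS = {
    "/old-red-widget": "/red-widget",
    "/products/red-widget": "/red-widget",
}

def handle_request(path):
    """Return (status, path): a 301 to the surviving page, or a normal 200."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(handle_request("/old-red-widget"))
```

The 301 status is what tells search engines the move is permanent, so ranking signals from the old URLs consolidate onto the surviving page instead of competing with it.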
Replace and Update Content
A top-ranking site should include nothing but solid, original content. If you sell products from either independent makers or large manufacturers, however, there’s a good chance your product copy isn’t unique. Countless other websites may use the same manufacturer-supplied descriptions, which makes it tough for yours to rank.
Rather than simply repeating standard content like product descriptions, give all of your pages the attention they deserve. Take the time to create distinct content in your brand’s own voice.
If your site has dozens or even hundreds of replicated pieces, replacing and updating them on such a large scale is a daunting task. Hire an experienced agency to create great content quickly so you can resolve the issue immediately.
Credit Syndicated Content
Whether your website syndicates content or your brand produces content for other sites, the duplication is cause for concern. Regardless of whether you’re the creator or the publisher, make sure that the syndicating website follows best practices.
To avoid a penalty, the syndicating website should credit the author and link back to the original content. Don’t hesitate to ask for a rel=canonical tag on the syndicated page, too; it points search engines in the right direction and makes sure the original author receives proper credit.
Don’t let duplicate content harm your SEO or deprive you of site traffic. Use these tips to clean up duplicate content issues, and keep your site at the top of the SERP.