Duplicate Homepage - How to fix?
-
Hi Everyone,
I've tried using BeamUsUp SEO Crawler and have found one warning and two errors on our site.
The warning is for a duplicate meta description, and the errors are a duplicate page and a duplicate title.
For each problem it shows the same two pages as the source of the error: one with a trailing slash at the end and one without. Both are the homepage.
Has anyone seen this before? Is this something we should worry about?
-
Moz was warning me about thin content, and I got the idea that it was because of the "/" version, based on a checker that reported zero unique phrases.
Maybe it's because I have both a "Home" and a "Home_master" page, although Home_master has the eye icon crossed out (so it shouldn't be visible to Google).
-
My answer above applies specifically to your situation too, @Elchanan. You are not being penalised by Google for duplicate content in your example.
As I mentioned, the root of a domain is a special case. The version with the slash at the end is considered the same URL as the one without the slash by both browsers and search engines. So much so that it is impossible to redirect from one to the other because that creates a redirect loop - the page is redirecting to itself.
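You can see the equivalence in the URL structure itself. A minimal Python sketch (example.com is just a placeholder domain):

```python
from urllib.parse import urlsplit

# The bare root URL has an empty path; the slashed version has "/".
bare = urlsplit("https://example.com")
slash = urlsplit("https://example.com/")
print(repr(bare.path), repr(slash.path))  # '' '/'

# An HTTP client must send "GET / HTTP/1.1" in both cases, because an
# empty path is transmitted as "/" on the wire. The server therefore
# receives the identical request for both spellings of the URL.
```

That's why no crawler should flag the two root spellings as separate pages: no server can even tell them apart.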
If that SEO tool has warned you of that issue, it's because the tool isn't programmed to handle this special case correctly.
Hope that helps.
Paul
-
I'm having a similar problem with my Wix site. I am being penalised by Google for duplicate content between "my-site.com/" and "my-site.com".
I tried a 301 redirect in Wix, but it doesn't seem to help according to https://datayze.com/thin-content-checker.php
Could someone help me?
-
Thanks for the great answer, Paul, that's very helpful.
-
This is an incorrect implementation in the BeamUsUp tool. The hostname (the basic root URL) is a special case. Both the version with the ending slash and without the ending slash are considered by browsers and search engines to be exactly the same.
In fact, you cannot redirect one to the other. Because the browser is programmed to consider them the same, you'll create an infinite loop. So not only is there nothing you should do, there's nothing you can do.
This is the only case where this is true though! For all other internal URLs, the version with the slash is considered a completely different URL from the one without it. So unless you redirect one version to the other for internal pages, you'll have duplicate content issues.
Hope that helps.
Paul
-
Hi there,
If you 301 redirect one version of the URL to the other (whichever version matches the rest of your URLs), it should solve this duplicate content issue.
Thanks!
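For what it's worth, here is a self-contained toy demonstration of what such a 301 does (a throwaway Python server on localhost, not Wix or any real hosting setup; the path names are made up):

```python
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Toy server: 301-redirects trailing-slash paths to the
    slash-less version; everything else is served directly."""
    def do_GET(self):
        if self.path != "/" and self.path.endswith("/"):
            self.send_response(301)
            self.send_header("Location", self.path.rstrip("/"))
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 301 automatically, so both spellings of the URL
# end up at the same single page -- which is exactly why the redirect
# resolves the duplicate content.
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/page/").read()
print(body)  # prints b'ok'
server.shutdown()
```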