Duplicate Domain Listings Gone?
-
This morning I'm noticing that, in several of the SERPs I track, domains which formerly had multiple pages listed on pages 1-3 for the same keyword are now down to one listing per domain.
I'm hoping this is a permanent and widespread change, as it would be a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...?
Example of what I mean by "duplicate domain listings", in case my wording is confusing (a quick way to spot these programmatically follows the example):
Search term "Product Item"
Pages ranking:
domain-one.com/product-item.html
domain-one.com/product-item-benefits.html
etc...
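If you track keywords in bulk, a minimal sketch like the one below can flag any domain holding more than one listing in a snapshot. The URL list is made-up example data standing in for a rank-tracker export, not real results:

```python
# Minimal sketch: flag domains that hold more than one listing in a tracked
# SERP snapshot. The URL list below is made-up example data.
from collections import defaultdict
from urllib.parse import urlparse

serp_top_30 = [
    "https://domain-one.com/product-item.html",
    "https://domain-one.com/product-item-benefits.html",
    "https://domain-two.com/product-item-review.html",
    # ... remaining tracked results
]

listings = defaultdict(list)
for position, url in enumerate(serp_top_30, start=1):
    domain = urlparse(url).netloc
    if domain.startswith("www."):
        domain = domain[4:]
    listings[domain].append((position, url))

for domain, pages in listings.items():
    if len(pages) > 1:
        print(f"{domain} has {len(pages)} listings: {pages}")
```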
-
Interesting, thanks for your insight as always, EGOL. Upon further research I have found a few double listings, but they have been for specific software, and the double listings are of the developer's own domain. So that makes sense to me.
Either way, it seems the algo is making exceptions for certain domains depending on the keyword and their authority for the actual search term.
-
Based upon the topics that I watch, Google recently increased the domain diversity of the SERPs by cutting back on the number of double listings, triple listings, quad and etc. listings.
You can still get two or three pages showing on the first page of the SERPs but it seems to be a lot harder. I have never considered "keyword cannibalization" to be a problem, but am starting to see it for some of the keywords that I am after.
For my retail areas... Informative content is now dominating the SERPs.
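One rough way to put a number on that diversity shift is to compare the ratio of unique domains to total results between an older snapshot and today's. A minimal sketch, with placeholder URLs standing in for real rank-tracker exports:

```python
# Sketch: compare domain diversity (unique domains / total results) between
# two SERP snapshots. Both URL lists are placeholders, not real data.
from urllib.parse import urlparse

def domain_diversity(urls):
    domains = {urlparse(u).netloc for u in urls}
    return len(domains) / len(urls)

last_month = [
    "https://site-a.com/guide", "https://site-a.com/faq",
    "https://site-a.com/benefits", "https://site-b.com/review",
]
today = [
    "https://site-a.com/guide", "https://site-b.com/review",
    "https://site-c.com/store", "https://site-d.com/news",
]

print(f"last month: {domain_diversity(last_month):.2f}")  # 0.50
print(f"today:      {domain_diversity(today):.2f}")       # 1.00
```

A rising ratio across the keywords you watch would be consistent with Google cutting back on double and triple listings.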
-
Maybe.
I have the #1 position for a corner of the market, but I could not get a second page onto the front page. Since I already had #1, I made a second site and then held #1 and #2, so I made another, and now I have three listings on the first page. Once you have #1, this seems the way to go.
-
Yeah I remember a long time ago they said they were going to do this and then on a few of my SERPs it never took effect. So I complained here and EGOL convinced me that "if I can't beat 'em, join 'em."
Well, it turned out I couldn't join 'em either, but I hate the concept anyway, so that's okay.
Anyway, for months and months these domains have had duplicate page listings on page 1 and beyond and it's been killing me. Today they're gone. So perhaps they just turned the dial up on the algo?
-
Some time ago Google made a change that did just this: it tried to get more domains onto the front page rather than many pages from the same domain.
That was a few years back, so I'm not sure what you are seeing today; it may be that those domains were penalized in some other way.
Related Questions
-
Google's Importance on usability issues in sub directories or sub domains?
Hi Moz community, as usability issues like page speed and mobile responsiveness play a key role in website rankings, I wonder how important the same factors are for subdirectory or subdomain pages. Must each and every page of a subdirectory or subdomain be optimised like the main website's pages? Does Google give them the same importance? Thanks
Algorithm Updates | vtmoz
-
Any recent updates from Google or community on sub domains vs sub directories?
Hi all, this has been a debate for years, and I have noticed that most SEOs suggest going with (or switching to) subdirectories instead of subdomains. Is this still the case, or are there any new updates from Google or the SEO community? We moved a subdomain to a subdirectory last year. The result was that the subdirectory content started ranking well, but there was no change in the main website's rankings. Because of moving subdomains to subdirectories, will the link juice/PR of the website get diluted as the number of pages increases, taking away some authority? Thanks
Algorithm Updates | vtmoz
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? SEL (Search Engine Land) recently said that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers, aka TAGFEE'ers, to weigh in on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a Domain Authority of 30 (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many (a rough sketch of that math is below).
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation? Like burning up a theoretical "crawl budget," having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might carry a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
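To make the scale of point 3 concrete, here is a back-of-the-envelope sketch. The facet names and value counts are hypothetical, not the actual catalog, but they show how quickly filter combinations balloon past any sane crawl budget:

```python
# Back-of-the-envelope sketch of how faceted navigation multiplies URLs.
# Facet names and value counts are hypothetical, for illustration only.
facets = {
    "category": 100,
    "brand": 20,
    "color": 10,
    "price_band": 5,
}

# Each facet is either unselected or set to one of its values, so the number
# of distinct filter-combination URLs is the product of (values + 1).
combinations = 1
for value_count in facets.values():
    combinations *= value_count + 1

print(f"{combinations:,} possible faceted URLs")  # ~140,000 before sort/pagination parameters
```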
Algorithm Updates | seo_plus
-
Links from high Domain authority sites
I have a relatively uncompetitive niche, ranking around number 6 for my keywords. Would getting a few links from some Moz DA 80-90 and DA 90-100 sites help my rankings a lot? Some of the pages linking to me from these sites might be deep in the site, pretty far from the home page, with a PageRank of "unranked" or a grayed-out bar. These pages might not have many links at all other than internal links from the site itself and would have a Moz PA of 10 or 20. Would they pass much PageRank or authority to my site, or would they not be worth going after? These links to my site would be in context on a blog. Thanks, mozzers!
Algorithm Updates | Ron10
-
What is the importance of listing score at Getlisted.org regarding higher local 7 pack rankings ?
What is the importance of the listing score at Getlisted.org for higher local 7-pack rankings? My listing score is 60% and I am trying to improve it. I am on the 2nd page of local map results. Will a maximum score help me get a place in the 7-pack?
Algorithm Updates | mnkpso
-
Is it allowed to put a word in all domains URLs to get higher in SERP?
Hello, what good or bad could happen if someone put the same keyword in all of a site's URLs? (e.g. I'm selling cars and my domain doesn't include the word "cars", so I put all of my pages in one folder like domain.com/cheap-cars/etc.)
Algorithm Updates | komeksimas
-
Double Listings On Page One
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it is most likely a change in the algorithm. Say I had a SERP where my site was showing two different sub-pages at #4 and #6; now one page is being pushed up to #3 while the other is being pushed back past the first page. I'm not worried about penalization or loss of value. I have been seeing this across many of my clients' sites. I just wanted to confirm that others are seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift. Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this". BTW - this is not affecting any of my brand SERPs.
Algorithm Updates | BenRWoodard
-
Was Panda applied at sub-domain or root-domain level?
Does anyone have any case studies or examples of sites where a specific sub-domain was hit by Panda while other sub-domains were fine? What's the general consensus on whether this was applied at the sub-domain or root-domain level? My thinking is that Google already knows broadly whether a "site" is a root-domain (e.g. SEOmoz) or a sub-domain (e.g. tumblr), and that they use this logic when rolling out Panda. I'd love to hear your thoughts and opinions, though.
Algorithm Updates | TomCritchlow