Duplicate Domain Listings Gone?
-
I'm noticing in several of the SERPs I track this morning that the domains that formerly had multiple pages listed on pages 1-3 for the same keyword are now reduced to one listing per domain.
I'm hoping this change is permanent and widespread, as it would be a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...?
Example of what I mean by "duplicate domain listings" (in case my wording is confusing here):
Search term "Product Item"
Pages ranking:
domain-one.com/product-item.html
domain-one.com/product-item-benefits.html
etc...
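The change being described is essentially a per-domain collapse: of all the pages a single domain has ranking for a query, only the top one survives. A minimal sketch of that idea (the URLs are the hypothetical ones from the example above, and the naive `netloc` parsing ignores subdomain/public-suffix subtleties a real implementation would need to handle):

```python
from urllib.parse import urlparse

def collapse_by_domain(ranked_urls):
    """Keep only the highest-ranked URL per domain, preserving rank order."""
    seen = set()
    collapsed = []
    for url in ranked_urls:
        domain = urlparse(url).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            collapsed.append(url)
    return collapsed

serp = [
    "https://domain-one.com/product-item.html",
    "https://domain-one.com/product-item-benefits.html",
    "https://domain-two.com/product-item-guide.html",
]
print(collapse_by_domain(serp))
# domain-one.com keeps only its first (top-ranked) listing
```

Under a filter like this, "domain-one.com/product-item-benefits.html" would disappear from the results, which matches what the thread is observing.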
-
Interesting, thanks for your insight as always, EGOL. Upon further research I have found a few double listings, but they have been for specific software, and the double listings are of the developer's domain. So that makes sense to me.
Either way, it seems the algorithm is making exceptions for certain domains depending on the keyword and the domain's authority on that search term.
-
Based on the topics that I watch, Google recently increased the domain diversity of the SERPs by cutting back on the number of double, triple, quadruple, and higher listings.
You can still get two or three pages showing on the first page of the SERPs, but it seems to be a lot harder. I had never considered "keyword cannibalization" to be a problem, but I am starting to see it for some of the keywords that I am after.
For my retail areas... Informative content is now dominating the SERPs.
-
Maybe.
I have the #1 position for a corner of the market, but I could not get a second page onto the front page. Since I already had #1, I made a second site and then held #1 and #2, so I made another; I now have three on the first page. Once you have #1, this seems to be the way to go.
-
Yeah, I remember a long time ago they said they were going to do this, and then on a few of my SERPs it never took effect. So I complained here, and EGOL convinced me that "if I can't beat 'em, join 'em."
Well, it turned out I couldn't join 'em either, but I hate the concept, so that's okay.
Anyway, for months and months these domains have had duplicate page listings on page 1 and beyond, and it's been killing me. Today they're gone. So perhaps they just turned the dial up on the algorithm?
-
Some time ago, Google made a change that did just this: it tried to get more domains onto the front page rather than showing many pages from the same domain.
That was a few years back, though, so I'm not sure what you are seeing today; it may be that those domains were penalized in some other way.