Duplicate Domain Listings Gone?
-
I'm noticing in several of the SERPs I track this morning that the domains that formerly had multiple pages listed on pages 1-3 for the same keyword are now reduced to one listing per domain.
I'm hoping that this is a permanent change and widespread as it is a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...?
An example of what I mean by "duplicate domain listings" (in case my wording is confusing here):
Search term "Product Item"
Pages ranking:
domain-one.com/product-item.html
domain-one.com/product-item-benefits.html
etc...
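If you track SERPs programmatically, the "one listing per domain" change is easy to spot: count how many results each registered domain holds in a single results page. A minimal sketch (the URLs and function name are hypothetical, not from any Moz tool):

```python
from urllib.parse import urlparse
from collections import Counter

def duplicate_domain_listings(serp_urls):
    """Given an ordered list of SERP result URLs, count listings per
    domain and return only the domains appearing more than once."""
    counts = Counter(
        urlparse(u).netloc.lower().removeprefix("www.") for u in serp_urls
    )
    return {domain: n for domain, n in counts.items() if n > 1}

# Example mirroring the "Product Item" scenario above
serp = [
    "https://domain-one.com/product-item.html",
    "https://domain-one.com/product-item-benefits.html",
    "https://domain-two.com/product-item-review.html",
]
print(duplicate_domain_listings(serp))  # {'domain-one.com': 2}
```

Run that before and after an apparent algorithm change and an empty result where you used to see duplicates confirms what you're describing.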
-
Interesting, thanks for your insight as always, EGOL. Upon further research I have found a few double listings, but they have been for specific software, and the double listings are of the developer's domain. So that makes sense to me.
Either way, it seems the algorithm is making exceptions for certain domains depending on the keyword and their authority on the actual search term.
-
Based upon the topics that I watch, Google recently increased the domain diversity of the SERPs by cutting back on the number of double, triple, quadruple, and higher-count listings.
You can still get two or three pages showing on the first page of the SERPs but it seems to be a lot harder. I have never considered "keyword cannibalization" to be a problem, but am starting to see it for some of the keywords that I am after.
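One way to spot the keyword cannibalization EGOL mentions is to check, across the keywords you track, where your own domain ranks with more than one URL. This is a hypothetical sketch, not any particular rank tracker's API; the data shape and names are assumptions:

```python
from urllib.parse import urlparse

def cannibalized_keywords(rankings, own_domain):
    """rankings: {keyword: [ranking URLs in SERP order]}.
    Return keywords where own_domain ranks with more than one URL."""
    flagged = {}
    for keyword, urls in rankings.items():
        own = [u for u in urls if urlparse(u).netloc.endswith(own_domain)]
        if len(own) > 1:
            flagged[keyword] = own
    return flagged

rankings = {
    "product item": [
        "https://domain-one.com/product-item.html",
        "https://domain-one.com/product-item-benefits.html",
        "https://domain-two.com/item.html",
    ],
    "widget guide": ["https://domain-two.com/widgets.html"],
}
print(cannibalized_keywords(rankings, "domain-one.com"))
```

Keywords it flags are the ones where reduced duplicate-domain listings would cost you a position.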
For my retail areas, informative content is now dominating the SERPs.
-
Maybe.
I have the #1 position for a corner of the market, but I could not get a second page onto the front page. Since I already held #1, I made a second site and then had #1 and #2; so I made another, and I now have three on the first page. Once you have #1, this seems to be the way to go.
-
Yeah I remember a long time ago they said they were going to do this and then on a few of my SERPs it never took effect. So I complained here and EGOL convinced me that "if I can't beat 'em, join 'em."
Well, it turned out I couldn't join 'em either, but I hate the concept anyway, so that's okay.
Anyway, for months and months these domains have had duplicate page listings on page 1 and beyond and it's been killing me. Today they're gone. So perhaps they just turned the dial up on the algo?
-
Some time ago Google made a change that did just this: it tried to get more domains onto the front page rather than many pages from the same domain.
That was a few years back, so I'm not sure what you are seeing today; it may be that the domains were penalized in some other way.