Product Subdomain Outranking "Marketing" Domains
-
Hello, Moz community!
I have been puzzling over what to do for a client. Here is the challenge.
The client's "product"/welcome page lives at www.client.com. This page allows the visitor to select the country/informational site they want OR to log in to their subdomain/install of the product.
Google is choosing this www.client.com URL as the main result for client brand searches.
In a perfect world, searchers in the US would be served the client's US version of the information/marketing site, which lives at https://client.com/us, and so on for other country-level content (each also living in its own country directory).
It's a brand-new client, we've done geo-targeting within Search Console, and I'm kind of scared to rock the boat by de-indexing this www.client.com welcome screen.
Any thoughts, ideas, or potential solutions are much appreciated.
THANKS!
-
Thanks! Such a great answer.
-
You are very right to be worried about rocking that particular boat. If you de-index a page, it basically nullifies its SEO authority. Since the page you would nullify is a homepage-level URL (you gave the example 'www.client.com'), this would basically be SEOicide.
Most other pages on your site probably get most of their SEO authority and ranking power from your homepage, directly or indirectly (e.g. homepage linking to sub-page, or homepage linking to category, which then links to sub-page).
This is because it's almost certain that your homepage is the URL which has gained the most links from across the web. People are lazy; they just pick the shortest URL when linking. I'm not saying you don't have good deep links, just that most of the good ones are probably hitting the homepage.
So if you nullify the homepage's right to hold SEO authority, what happens to everything underneath it? Are you imagining an avalanche right now? That's right: this would be one of the worst possible ideas in the universe. Write it down, print it out, and burn it.
Search Console geo-targeting is for whole sites, not pages or (usually, though there can be exceptions) sections; you know that, right? What it does is tell Google which country you want the website (the whole property you have selected) to rank in. It basically stops that property from ranking well globally and gives minor boosts in the selected location. If you just took your homepage-level property and told Google it's US now, prepare to kiss most of your other traffic goodbye (hard lesson). If you were semi-smart and added /us/ as a separate property, and only set the geo-targeting to US for that property, breathe a sigh of relief: it likely won't solve your issue, but it won't be a complete catastrophe either (phew!)
Really, the only decent tool you have to direct Google to rank individual web pages by region and/or language is the hreflang tag. These tags tell Google: "hey, you landed on me and I'm a valid page, but if you want to see versions of me for other languages or locations, go to these other URLs via my hreflang links". Hreflangs only work if they are mutually agreed: both pages contain mirrored hreflangs pointing at each other, and neither page gives multiple URLs for a single language/location combination (or for a language or location in isolation).
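To make the "mutually agreed" requirement concrete, here is a minimal sketch of auditing reciprocity. The URLs and language codes below are invented for illustration; a real audit would crawl the live `<head>` of each page rather than use a hard-coded map:

```python
# Hypothetical hreflang alternates for two country pages, as each page might
# declare them in its <head>, e.g.:
#   <link rel="alternate" hreflang="en-us" href="https://client.com/us/" />
#   <link rel="alternate" hreflang="en-gb" href="https://client.com/uk/" />
pages = {
    "https://client.com/us/": {"en-us": "https://client.com/us/",
                               "en-gb": "https://client.com/uk/"},
    "https://client.com/uk/": {"en-gb": "https://client.com/uk/",
                               "en-us": "https://client.com/us/"},
}

def hreflang_errors(pages):
    """Flag (page, alternate) pairs where the alternate does not link back."""
    errors = []
    for page, alternates in pages.items():
        for lang, alt in alternates.items():
            if alt == page:
                continue  # a self-referencing hreflang is fine (and recommended)
            # The alternate page must list *this* page among its own hreflangs
            if page not in pages.get(alt, {}).values():
                errors.append((page, alt))
    return errors

print(hreflang_errors(pages))  # an empty list means every link is mirrored
```

If either page dropped its return link, the pair would show up in the error list, and Google would be within its rights to ignore the whole annotation.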
The problem is, even if you do everything right, Google really has to believe "yes, this other page is another version of exactly the same page I'm looking at right now". Google can do things like take the main content of both URLs, put each into a single string, then check the string similarity of the two content strings to get a 'percentage' of content similarity. Well, that's how I check content similarity; Google does something comparable, but probably infinitely more elegant and clever. In the case of hreflangs, translation of the strings is probably also applied.
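A crude version of that kind of check can be sketched with Python's standard library. The sample strings are invented stand-ins for the two page types in this thread; whatever Google actually does is, as said, far more sophisticated:

```python
from difflib import SequenceMatcher

def content_similarity(a: str, b: str) -> float:
    """Return a rough 0.0-1.0 similarity ratio between two content strings."""
    return SequenceMatcher(None, a, b).ratio()

# Invented stand-ins: a functional welcome page vs. a marketing page
functional = "Select your country or log in to your account to access the product."
marketing = "The best widgets in the industry. Request a demo and grow your revenue today."

print(content_similarity(functional, functional))  # identical content scores 1.0
print(content_similarity(functional, marketing))   # very different content scores low
```

A pair of pages that a human would call "the same page in two flavours" can still score poorly on a mechanical comparison like this, which is exactly the trap described above.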
If Google's mechanical mind thinks that the pages are very different, then it will simply ignore the hreflang (just as Google will not pass SEO authority through a 301 redirect if the contents of the old and new pages are highly dissimilar in machine terms).
This is a fail-safe Google has to stop people from moving high rankings from 'useful' or 'proven' (via hyperlinks) URLs onto less useful or less proven pages (which, by Google's logic, if the content is very different, should have to re-prove their worth). Remember, what a human thinks is similar is irrelevant here. You need to focus on what a machine would find similar (those can be VERY different things).
So even if you do it all properly and use hreflangs, since the nature of the pages is very different (one is functional, helping users navigate, log in and download something, which is very useful; the other is selly, and marketing content is usually thin), it's unlikely that Google will swallow your intended URL serving.
You'd be better off making the homepage include some marketing elements and making the marketing URLs include some of the functional elements. If both pages do both things well and are essentially the same, then hreflangs might actually start to work.
If you want to keep the marketing URLs pure sell, fine, but then they will only be useful as paid-traffic landing pages (e.g. from Google Ads, Pinterest Ads, or Facebook Ads), where you can connect your ad to the advertorial (marketing) URLs. People expect ads to land on marketing-centric pages; they don't expect (or necessarily want) that for regular web searches. The channel (SEO) is called 'organic' for a reason!